Compare commits — 10 commits by yuetsh (...a1b51ebb9e)

| SHA1 |
|---|
| a1b51ebb9e |
| a9a6b87fef |
| 2d3588c755 |
| a2bfc28ac7 |
| 6aac767641 |
| 73af9d96b2 |
| 8a2fa11afc |
| 3f1c7250bd |
| bd0a7f30f8 |
| 8a043d2ffa |

.flake8 — 10 lines (new file)
@@ -0,0 +1,10 @@
[flake8]
exclude =
    xss_filter.py,
    */migrations/,
    *settings.py
    */apps.py
    venv/
max-line-length = 180
inline-quotes = "
no-accept-encodings = True
CLAUDE.md — 143 lines (deleted)
@@ -1,143 +0,0 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

**OnlineJudge** is the backend for an Online Judge platform. Built with Django 5 + Django REST Framework, PostgreSQL, Redis, Django Channels (WebSocket), and Dramatiq (async task queue). Python 3.12+, managed with `uv`.

## Commands

```bash
# Development
python dev.py                                      # Start dev server: Django on :8000 + Daphne WebSocket on :8001
python manage.py runserver                         # HTTP only (no WebSocket support)
python manage.py migrate                           # Apply database migrations
python manage.py makemigrations                    # Create new migrations

# Dependencies
uv sync                                            # Install dependencies from uv.lock
uv add <package>                                   # Add a dependency

# Testing
python manage.py test                              # Run all tests
python manage.py test account                      # Run tests for a single app
python manage.py test account.tests.TestClassName  # Run a single test class
python run_test.py                                 # Run flake8 lint + coverage in one step
python run_test.py -m account                      # Run flake8 + tests for a single module
python run_test.py -c                              # Run flake8 + tests + open HTML coverage report

# Initial setup
python manage.py inituser --username admin --password <pw> --action create_super_admin
python manage.py inituser --username admin --password <pw> --action reset
```

## Testing Policy

Do not write tests
## Architecture

### App Modules

Each Django app follows the same structure:

```
<app>/
├── models.py        # Django models
├── serializers.py   # DRF serializers
├── views/
│   ├── oj.py        # User-facing API views
│   └── admin.py     # Admin API views
└── urls/
    ├── oj.py        # User-facing URL patterns
    └── admin.py     # Admin URL patterns
```

Apps: `account`, `problem`, `submission`, `contest`, `ai`, `flowchart`, `problemset`, `class_pk`, `announcement`, `tutorial`, `message`, `comment`, `conf`, `options`, `judge`
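To make the split concrete, here is a minimal sketch of how a view in `views/oj.py` pairs with its entry in `urls/oj.py`. The `PingAPI` view, the URL name, and the simplified `APIView` stand-in are all hypothetical illustrations, not code from the repo:

```python
# Stand-in for utils.api.APIView -- only the response envelope is mirrored here.
class APIView:
    def success(self, data=None):
        return {"error": None, "data": data}


# <app>/views/oj.py -- a hypothetical user-facing view
class PingAPI(APIView):
    def get(self, request):
        return self.success("pong")


# <app>/urls/oj.py -- hypothetical URL table, shown as plain tuples
# (the real files use Django's url()/path() helpers)
urlpatterns = [
    ("ping/", PingAPI, "ping_api"),
]
```

The same view class would get a sibling in `views/admin.py` wired up under `api/admin/` if it needed an admin-facing variant.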
`utils/` is itself a Django app (listed in `INSTALLED_APPS`) — not just a helpers package. It provides `RichTextField` (XSS-sanitized `TextField`), `APIError`, the base `APIView`, caching, WebSocket helpers, and the `inituser` management command. Import shared utilities from `utils.*`.

### URL Routing

All routes are registered in `oj/urls.py`:

- `api/` — user-facing endpoints
- `api/admin/` — admin-only endpoints

WebSocket routing is in `oj/routing.py`.

### Settings Structure

- `oj/settings.py` — base configuration (imports dev or production settings based on `OJ_ENV`)
- `oj/dev_settings.py` — development overrides (imported when `OJ_ENV != "production"`)
- `oj/production_settings.py` — production overrides
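The layering above can be sketched as a small selector. The module names match the repo; the selection function itself is only an illustration, since the real import logic lives inside `oj/settings.py`:

```python
import os


def settings_module(env=None):
    """Pick the overrides module the way the OJ_ENV switch is described."""
    if env is None:
        env = os.environ.get("OJ_ENV", "")
    if env == "production":
        return "oj.production_settings"
    # Anything that is not "production" falls back to dev overrides.
    return "oj.dev_settings"
```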
### Base APIView & View Patterns

`utils/api/api.py` provides the custom base classes and decorators used by **all** views:

- **`APIView`** — base class for all views (not DRF's `APIView`). Key methods:
  - `self.success(data)` — returns `{"error": null, "data": data}`
  - `self.error(msg)` — returns `{"error": "error", "data": msg}`
  - `self.paginate_data(request, query_set, serializer)` — offset/limit pagination
  - `self.invalid_serializer(serializer)` — standard validation error response
- **`CSRFExemptAPIView`** — same as `APIView` but CSRF-exempt
- **`@validate_serializer(SerializerClass)`** — decorator for view methods that validates `request.data` against a serializer before the method runs. On success, `request.data` is replaced with validated data.

Typical view method pattern:

```python
@validate_serializer(CreateProblemSerializer)
@super_admin_required
def post(self, request):
    # request.data is already validated
    return self.success(...)
```
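For reference, the two envelopes produced by `self.success` / `self.error` look like this. Plain functions returning dicts are shown for illustration; the real methods wrap the dict in a `JSONResponse`:

```python
def success(data=None):
    # Success envelope: "error" is null, "data" carries the payload.
    return {"error": None, "data": data}


def error(msg, err="error"):
    # Error envelope: "error" holds a short code, "data" the message.
    return {"error": err, "data": msg}
```

Permission decorators reuse the same shape with a different code, e.g. `{"error": "permission-denied", "data": "Please login first"}` as asserted in `account/tests.py`.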
### Authentication & Permissions

`account/decorators.py` provides decorators used on view methods:

- `@login_required` / `@admin_role_required` / `@super_admin_required`
- `@problem_permission_required`
- `@check_contest_permission(check_type)` — validates contest access, sets `self.contest`
- `ensure_created_by(obj, user)` — helper that raises `APIError` if user doesn't own the object
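A hedged sketch of the pattern behind `@login_required`: the stand-in view and request objects below are hypothetical, and the real decorator in `account/decorators.py` may differ in detail, but the short-circuit into the error envelope matches what `account/tests.py` asserts:

```python
import functools


def login_required(func):
    @functools.wraps(func)
    def wrapper(self, request, *args, **kwargs):
        if not request.user.is_authenticated:
            # Message and code as asserted by UserChangePasswordAPITest.
            return self.error("Please login first", err="permission-denied")
        return func(self, request, *args, **kwargs)
    return wrapper


class DemoView:  # hypothetical stand-in for a real APIView subclass
    def error(self, msg, err="error"):
        return {"error": err, "data": msg}

    @login_required
    def get(self, request):
        return {"error": None, "data": "ok"}
```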
### Judge System

- `judge/dispatcher.py` — dispatches submissions to the judge sandbox (JudgeServer)
- `judge/tasks.py` — Dramatiq async tasks for judging
- `judge/languages.py` — language configurations (compile/run commands, limits)

Judge status codes are defined in `submission/models.py` (`JudgeStatus` class, codes -2 to 8) and must match the frontend's `utils/constants.ts`.
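The -2..8 range maps onto constants like the following sketch. The values come from the Key Domain Concepts table later in this file; the abbreviated names follow that table, and the real class may spell them out in full:

```python
class JudgeStatus:
    COMPILE_ERROR = -2
    WRONG_ANSWER = -1
    ACCEPTED = 0
    CPU_TLE = 1   # CPU time limit exceeded
    REAL_TLE = 2  # wall-clock time limit exceeded
    MLE = 3       # memory limit exceeded
    RE = 4        # runtime error
    SE = 5        # system error
    PENDING = 6
    JUDGING = 7
    PARTIALLY_ACCEPTED = 8
```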
### Site Configuration (SysOptions)

`options/options.py` provides `SysOptions` — a metaclass-based system for site-wide configuration stored in the database with thread-local caching. Access settings like `SysOptions.smtp_config`, `SysOptions.languages`, etc.
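A minimal sketch of the metaclass-property pattern, where an in-memory dict stands in for the database plus thread-local cache. The option names and the storage mechanics here are illustrative, not the actual `options/options.py` implementation:

```python
class OptionsMeta(type):
    # In the real SysOptions, reads hit a thread-local cache backed by
    # the database; a plain dict stands in for that store here.
    _store = {"allow_register": True, "languages": ["C", "Python3"]}

    def __getattr__(cls, name):
        # Called only when normal class-attribute lookup fails.
        try:
            return cls._store[name]
        except KeyError:
            raise AttributeError(name)

    def __setattr__(cls, name, value):
        cls._store[name] = value


class SysOptions(metaclass=OptionsMeta):
    pass
```

Because the attributes live on the metaclass, options are read and written directly on the class (`SysOptions.allow_register = False`), which is the access style the tests use.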
### WebSocket (Channels)

`submission/consumers.py` — WebSocket consumer for real-time submission status updates. Uses `channels-redis` as the channel layer backend. Push updates via `utils/websocket.py:push_submission_update()`.
### Caching

Redis-backed via `django-redis`. Cache keys use MD5 hashing for consistency. See `utils/cache.py`.
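The MD5-keying idea can be sketched like this. The exact key format in `utils/cache.py` is not shown in this diff, so the joining scheme below is an assumption:

```python
import hashlib


def cache_key(*parts):
    # Join the logical key parts, then hash so keys have a fixed,
    # Redis-safe length regardless of the raw input.
    raw = ":".join(str(p) for p in parts)
    return hashlib.md5(raw.encode("utf-8")).hexdigest()
```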
### AI Integration

`utils/openai.py` — OpenAI client wrapper configured to work with OpenAI-compatible APIs (e.g., DeepSeek). Used by `ai/` app for submission analysis.

### Data Directory

Test cases and submission outputs are stored in a separate data directory (configured in settings, not in the repo). The `data/` directory in the repo contains configuration templates and `secret.key`.

## Key Domain Concepts

| Concept | Details |
|---|---|
| Problem types | ACM (binary accept/reject) vs OI (partial scoring) |
| Judge statuses | COMPILE_ERROR(-2), WRONG_ANSWER(-1), ACCEPTED(0), CPU_TLE(1), REAL_TLE(2), MLE(3), RE(4), SE(5), PENDING(6), JUDGING(7), PARTIALLY_ACCEPTED(8) |
| User roles | Regular / Admin / Super Admin |
| Contest types | Public vs Password Protected |
| Supported languages | C, C++, Python2, Python3, Java, JavaScript, Golang, Flowchart |

## Related Repository

The frontend is at `../ojnext` — a Vue 3 + Rsbuild project. See its CLAUDE.md for frontend details.
Dockerfile — 26 lines
@@ -2,26 +2,21 @@ FROM python:3.12.2-alpine
ARG TARGETARCH
ARG TARGETVARIANT

RUN sed -i 's|dl-cdn.alpinelinux.org|mirrors.tuna.tsinghua.edu.cn|g' /etc/apk/repositories
RUN sed -i 's/dl-cdn.alpinelinux.org/mirrors.ustc.edu.cn/g' /etc/apk/repositories

ENV OJ_ENV production
WORKDIR /app

COPY ./deploy/requirements.txt /app/deploy/

RUN --mount=type=cache,target=/var/cache/apk,id=apk-cache-$TARGETARCH$TARGETVARIANT-final \
    --mount=type=cache,target=/root/.cache/pip,id=pip-cache-$TARGETARCH$TARGETVARIANT-final \
# psycopg2: libpg-dev
# pillow: libjpeg-turbo-dev zlib-dev freetype-dev
RUN --mount=type=cache,target=/etc/apk/cache,id=apk-cahce-$TARGETARCH$TARGETVARIANT-final \
    --mount=type=cache,target=/root/.cache/pip,id=pip-cahce-$TARGETARCH$TARGETVARIANT-final \
    <<EOS
set -ex
pip config set global.index-url https://pypi.tuna.tsinghua.edu.cn/simple
apk add --no-cache \
    gcc libc-dev python3-dev \
    libpq libpq-dev \
    libjpeg-turbo libjpeg-turbo-dev \
    zlib zlib-dev \
    freetype freetype-dev \
    supervisor openssl nginx curl unzip
pip install --no-cache-dir -r /app/deploy/requirements.txt
pip config set global.index-url https://mirrors.ustc.edu.cn/pypi/web/simple
apk add gcc libc-dev python3-dev libpq libpq-dev libjpeg-turbo libjpeg-turbo-dev zlib zlib-dev freetype freetype-dev supervisor openssl nginx curl unzip
pip install -r /app/deploy/requirements.txt
apk del gcc libc-dev python3-dev libpq-dev libjpeg-turbo-dev zlib-dev freetype-dev
EOS

@@ -29,7 +24,6 @@ COPY ./ /app/
RUN mkdir -p /app/dist/
RUN chmod -R u=rwX,go=rX ./ && chmod +x ./deploy/entrypoint.sh

HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
    CMD python3 /app/deploy/health_check.py
HEALTHCHECK --interval=5s CMD [ "/usr/local/bin/python3", "/app/deploy/health_check.py" ]
EXPOSE 8000
ENTRYPOINT [ "/app/deploy/entrypoint.sh" ]
ENTRYPOINT [ "/app/deploy/entrypoint.sh" ]
@@ -2,11 +2,10 @@ import functools
import hashlib
import time

from contest.models import Contest, ContestRuleType, ContestStatus, ContestType
from problem.models import Problem
from utils.api import APIError, JSONResponse
from contest.models import Contest, ContestType, ContestStatus, ContestRuleType
from utils.api import JSONResponse, APIError
from utils.constants import CONTEST_PASSWORD_SESSION_KEY

from .models import ProblemPermission
@@ -1,10 +1,10 @@
from django.conf import settings
from django.db import connection
from django.utils.deprecation import MiddlewareMixin
from django.utils.timezone import now
from django.utils.deprecation import MiddlewareMixin

from account.models import User
from utils.api import JSONResponse
from account.models import User


class APITokenAuthMiddleware(MiddlewareMixin):
@@ -1,18 +0,0 @@
# Generated by Django 5.2.3 on 2025-09-19 06:11

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('account', '0001_initial'),
    ]

    operations = [
        migrations.AddField(
            model_name='userprofile',
            name='class_name',
            field=models.TextField(null=True),
        ),
    ]
@@ -1,22 +0,0 @@
# Generated by Django 5.2.3 on 2025-09-19 06:14

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('account', '0002_userprofile_class_name'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='userprofile',
            name='class_name',
        ),
        migrations.AddField(
            model_name='user',
            name='class_name',
            field=models.TextField(null=True),
        ),
    ]
@@ -1,7 +1,6 @@
from django.conf import settings
from django.contrib.auth.models import AbstractBaseUser
from django.conf import settings
from django.db import models

from utils.models import JSONField


@@ -26,7 +25,6 @@ class UserManager(models.Manager):

class User(AbstractBaseUser):
    username = models.TextField(unique=True)
    class_name = models.TextField(null=True)
    email = models.TextField(null=True)
    create_time = models.DateTimeField(auto_now_add=True, null=True)
    # One of UserType
@@ -52,9 +50,6 @@ class User(AbstractBaseUser):

    objects = UserManager()

    def is_regular_user(self):
        return self.admin_type == AdminType.REGULAR_USER

    def is_admin(self):
        return self.admin_type == AdminType.ADMIN
@@ -1,6 +1,6 @@
from django import forms

from utils.api import UsernameSerializer, serializers
from utils.api import serializers, UsernameSerializer

from .models import AdminType, ProblemPermission, User, UserProfile

@@ -67,7 +67,6 @@ class UserAdminSerializer(serializers.ModelSerializer):
            "open_api",
            "is_disabled",
            "raw_password",
            "class_name",
        ]

    def get_real_name(self, obj):
@@ -94,7 +93,6 @@ class UserSerializer(serializers.ModelSerializer):
            "two_factor_auth",
            "open_api",
            "is_disabled",
            "class_name",
        ]


@@ -131,7 +129,7 @@ class EditUserSerializer(serializers.Serializer):
    open_api = serializers.BooleanField()
    two_factor_auth = serializers.BooleanField()
    is_disabled = serializers.BooleanField()
    class_name = serializers.CharField(required=False, allow_null=True, allow_blank=True)


class EditUserProfileSerializer(serializers.Serializer):
    real_name = serializers.CharField(max_length=32, allow_null=True, required=False)
@@ -143,6 +141,7 @@ class EditUserProfileSerializer(serializers.Serializer):
    major = serializers.CharField(max_length=64, allow_blank=True, required=False)
    language = serializers.CharField(max_length=32, allow_blank=True, required=False)


class ApplyResetPasswordSerializer(serializers.Serializer):
    email = serializers.EmailField()
    captcha = serializers.CharField()
@@ -1,9 +1,8 @@
import logging

import dramatiq

from options.options import SysOptions
from utils.shortcuts import DRAMATIQ_WORKER_ARGS, send_email
from utils.shortcuts import send_email, DRAMATIQ_WORKER_ARGS

logger = logging.getLogger(__name__)
account/tests.py — 646 lines (new file)
@@ -0,0 +1,646 @@
|
||||
import time
|
||||
|
||||
from unittest import mock
|
||||
from datetime import timedelta
|
||||
from copy import deepcopy
|
||||
|
||||
from django.contrib import auth
|
||||
from django.utils.timezone import now
|
||||
from otpauth import OtpAuth
|
||||
|
||||
from utils.api.tests import APIClient, APITestCase
|
||||
from utils.shortcuts import rand_str
|
||||
from options.options import SysOptions
|
||||
|
||||
from .models import AdminType, ProblemPermission, User
|
||||
from utils.constants import ContestRuleType
|
||||
|
||||
|
||||
class PermissionDecoratorTest(APITestCase):
|
||||
def setUp(self):
|
||||
self.regular_user = User.objects.create(username="regular_user")
|
||||
self.admin = User.objects.create(username="admin")
|
||||
self.super_admin = User.objects.create(username="super_admin")
|
||||
self.request = mock.MagicMock()
|
||||
self.request.user.is_authenticated = mock.MagicMock()
|
||||
|
||||
def test_login_required(self):
|
||||
self.request.user.is_authenticated.return_value = False
|
||||
|
||||
def test_admin_required(self):
|
||||
pass
|
||||
|
||||
def test_super_admin_required(self):
|
||||
pass
|
||||
|
||||
|
||||
class DuplicateUserCheckAPITest(APITestCase):
|
||||
def setUp(self):
|
||||
user = self.create_user("test", "test123", login=False)
|
||||
user.email = "test@test.com"
|
||||
user.save()
|
||||
self.url = self.reverse("check_username_or_email")
|
||||
|
||||
def test_duplicate_username(self):
|
||||
resp = self.client.post(self.url, data={"username": "test"})
|
||||
data = resp.data["data"]
|
||||
self.assertEqual(data["username"], True)
|
||||
resp = self.client.post(self.url, data={"username": "Test"})
|
||||
self.assertEqual(resp.data["data"]["username"], True)
|
||||
|
||||
def test_ok_username(self):
|
||||
resp = self.client.post(self.url, data={"username": "test1"})
|
||||
data = resp.data["data"]
|
||||
self.assertFalse(data["username"])
|
||||
|
||||
def test_duplicate_email(self):
|
||||
resp = self.client.post(self.url, data={"email": "test@test.com"})
|
||||
self.assertEqual(resp.data["data"]["email"], True)
|
||||
resp = self.client.post(self.url, data={"email": "Test@Test.com"})
|
||||
self.assertTrue(resp.data["data"]["email"])
|
||||
|
||||
def test_ok_email(self):
|
||||
resp = self.client.post(self.url, data={"email": "aa@test.com"})
|
||||
self.assertFalse(resp.data["data"]["email"])
|
||||
|
||||
|
||||
class TFARequiredCheckAPITest(APITestCase):
|
||||
def setUp(self):
|
||||
self.url = self.reverse("tfa_required_check")
|
||||
self.create_user("test", "test123", login=False)
|
||||
|
||||
def test_not_required_tfa(self):
|
||||
resp = self.client.post(self.url, data={"username": "test"})
|
||||
self.assertSuccess(resp)
|
||||
self.assertEqual(resp.data["data"]["result"], False)
|
||||
|
||||
def test_required_tfa(self):
|
||||
user = User.objects.first()
|
||||
user.two_factor_auth = True
|
||||
user.save()
|
||||
resp = self.client.post(self.url, data={"username": "test"})
|
||||
self.assertEqual(resp.data["data"]["result"], True)
|
||||
|
||||
|
||||
class UserLoginAPITest(APITestCase):
|
||||
def setUp(self):
|
||||
self.username = self.password = "test"
|
||||
self.user = self.create_user(username=self.username, password=self.password, login=False)
|
||||
self.login_url = self.reverse("user_login_api")
|
||||
|
||||
def _set_tfa(self):
|
||||
self.user.two_factor_auth = True
|
||||
tfa_token = rand_str(32)
|
||||
self.user.tfa_token = tfa_token
|
||||
self.user.save()
|
||||
return tfa_token
|
||||
|
||||
def test_login_with_correct_info(self):
|
||||
response = self.client.post(self.login_url,
|
||||
data={"username": self.username, "password": self.password})
|
||||
self.assertDictEqual(response.data, {"error": None, "data": "Succeeded"})
|
||||
|
||||
user = auth.get_user(self.client)
|
||||
self.assertTrue(user.is_authenticated)
|
||||
|
||||
def test_login_with_correct_info_upper_username(self):
|
||||
resp = self.client.post(self.login_url, data={"username": self.username.upper(), "password": self.password})
|
||||
self.assertDictEqual(resp.data, {"error": None, "data": "Succeeded"})
|
||||
user = auth.get_user(self.client)
|
||||
self.assertTrue(user.is_authenticated)
|
||||
|
||||
def test_login_with_wrong_info(self):
|
||||
response = self.client.post(self.login_url,
|
||||
data={"username": self.username, "password": "invalid_password"})
|
||||
self.assertDictEqual(response.data, {"error": "error", "data": "Invalid username or password"})
|
||||
|
||||
user = auth.get_user(self.client)
|
||||
self.assertFalse(user.is_authenticated)
|
||||
|
||||
def test_tfa_login(self):
|
||||
token = self._set_tfa()
|
||||
code = OtpAuth(token).totp()
|
||||
if len(str(code)) < 6:
|
||||
code = (6 - len(str(code))) * "0" + str(code)
|
||||
response = self.client.post(self.login_url,
|
||||
data={"username": self.username,
|
||||
"password": self.password,
|
||||
"tfa_code": code})
|
||||
self.assertDictEqual(response.data, {"error": None, "data": "Succeeded"})
|
||||
|
||||
user = auth.get_user(self.client)
|
||||
self.assertTrue(user.is_authenticated)
|
||||
|
||||
def test_tfa_login_wrong_code(self):
|
||||
self._set_tfa()
|
||||
response = self.client.post(self.login_url,
|
||||
data={"username": self.username,
|
||||
"password": self.password,
|
||||
"tfa_code": "qqqqqq"})
|
||||
self.assertDictEqual(response.data, {"error": "error", "data": "Invalid two factor verification code"})
|
||||
|
||||
user = auth.get_user(self.client)
|
||||
self.assertFalse(user.is_authenticated)
|
||||
|
||||
def test_tfa_login_without_code(self):
|
||||
self._set_tfa()
|
||||
response = self.client.post(self.login_url,
|
||||
data={"username": self.username,
|
||||
"password": self.password})
|
||||
self.assertDictEqual(response.data, {"error": "error", "data": "tfa_required"})
|
||||
|
||||
user = auth.get_user(self.client)
|
||||
self.assertFalse(user.is_authenticated)
|
||||
|
||||
def test_user_disabled(self):
|
||||
self.user.is_disabled = True
|
||||
self.user.save()
|
||||
resp = self.client.post(self.login_url, data={"username": self.username,
|
||||
"password": self.password})
|
||||
self.assertDictEqual(resp.data, {"error": "error", "data": "Your account has been disabled"})
|
||||
|
||||
|
||||
class CaptchaTest(APITestCase):
|
||||
def _set_captcha(self, session):
|
||||
captcha = rand_str(4)
|
||||
session["_django_captcha_key"] = captcha
|
||||
session["_django_captcha_expires_time"] = int(time.time()) + 30
|
||||
session.save()
|
||||
return captcha
|
||||
|
||||
|
||||
class UserRegisterAPITest(CaptchaTest):
|
||||
def setUp(self):
|
||||
self.client = APIClient()
|
||||
self.register_url = self.reverse("user_register_api")
|
||||
self.captcha = rand_str(4)
|
||||
|
||||
self.data = {"username": "test_user", "password": "testuserpassword",
|
||||
"real_name": "real_name", "email": "test@qduoj.com",
|
||||
"captcha": self._set_captcha(self.client.session)}
|
||||
|
||||
def test_website_config_limit(self):
|
||||
SysOptions.allow_register = False
|
||||
resp = self.client.post(self.register_url, data=self.data)
|
||||
self.assertDictEqual(resp.data, {"error": "error", "data": "Register function has been disabled by admin"})
|
||||
|
||||
def test_invalid_captcha(self):
|
||||
self.data["captcha"] = "****"
|
||||
response = self.client.post(self.register_url, data=self.data)
|
||||
self.assertDictEqual(response.data, {"error": "error", "data": "Invalid captcha"})
|
||||
|
||||
self.data.pop("captcha")
|
||||
response = self.client.post(self.register_url, data=self.data)
|
||||
self.assertTrue(response.data["error"] is not None)
|
||||
|
||||
def test_register_with_correct_info(self):
|
||||
response = self.client.post(self.register_url, data=self.data)
|
||||
self.assertDictEqual(response.data, {"error": None, "data": "Succeeded"})
|
||||
|
||||
def test_username_already_exists(self):
|
||||
self.test_register_with_correct_info()
|
||||
|
||||
self.data["captcha"] = self._set_captcha(self.client.session)
|
||||
self.data["email"] = "test1@qduoj.com"
|
||||
response = self.client.post(self.register_url, data=self.data)
|
||||
self.assertDictEqual(response.data, {"error": "error", "data": "Username already exists"})
|
||||
|
||||
def test_email_already_exists(self):
|
||||
self.test_register_with_correct_info()
|
||||
|
||||
self.data["captcha"] = self._set_captcha(self.client.session)
|
||||
self.data["username"] = "test_user1"
|
||||
response = self.client.post(self.register_url, data=self.data)
|
||||
self.assertDictEqual(response.data, {"error": "error", "data": "Email already exists"})
|
||||
|
||||
|
||||
class SessionManagementAPITest(APITestCase):
|
||||
def setUp(self):
|
||||
self.create_user("test", "test123")
|
||||
self.url = self.reverse("session_management_api")
|
||||
# launch a request to provide session data
|
||||
login_url = self.reverse("user_login_api")
|
||||
self.client.post(login_url, data={"username": "test", "password": "test123"})
|
||||
|
||||
def test_get_sessions(self):
|
||||
resp = self.client.get(self.url)
|
||||
self.assertSuccess(resp)
|
||||
data = resp.data["data"]
|
||||
self.assertEqual(len(data), 1)
|
||||
|
||||
# def test_delete_session_key(self):
|
||||
# resp = self.client.delete(self.url + "?session_key=" + self.session_key)
|
||||
# self.assertSuccess(resp)
|
||||
|
||||
def test_delete_session_with_invalid_key(self):
|
||||
resp = self.client.delete(self.url + "?session_key=aaaaaaaaaa")
|
||||
self.assertDictEqual(resp.data, {"error": "error", "data": "Invalid session_key"})
|
||||
|
||||
|
||||
class UserProfileAPITest(APITestCase):
|
||||
def setUp(self):
|
||||
self.url = self.reverse("user_profile_api")
|
||||
|
||||
def test_get_profile_without_login(self):
|
||||
resp = self.client.get(self.url)
|
||||
self.assertDictEqual(resp.data, {"error": None, "data": None})
|
||||
|
||||
def test_get_profile(self):
|
||||
self.create_user("test", "test123")
|
||||
resp = self.client.get(self.url)
|
||||
self.assertSuccess(resp)
|
||||
|
||||
def test_update_profile(self):
|
||||
self.create_user("test", "test123")
|
||||
update_data = {"real_name": "zemal", "submission_number": 233, "language": "en-US"}
|
||||
resp = self.client.put(self.url, data=update_data)
|
||||
self.assertSuccess(resp)
|
||||
data = resp.data["data"]
|
||||
self.assertEqual(data["real_name"], "zemal")
|
||||
self.assertEqual(data["submission_number"], 0)
|
||||
self.assertEqual(data["language"], "en-US")
|
||||
|
||||
|
||||
class TwoFactorAuthAPITest(APITestCase):
|
||||
def setUp(self):
|
||||
self.url = self.reverse("two_factor_auth_api")
|
||||
self.create_user("test", "test123")
|
||||
|
||||
def _get_tfa_code(self):
|
||||
user = User.objects.first()
|
||||
code = OtpAuth(user.tfa_token).totp()
|
||||
if len(str(code)) < 6:
|
||||
code = (6 - len(str(code))) * "0" + str(code)
|
||||
return code
|
||||
|
||||
def test_get_image(self):
|
||||
resp = self.client.get(self.url)
|
||||
self.assertSuccess(resp)
|
||||
|
||||
def test_open_tfa_with_invalid_code(self):
|
||||
self.test_get_image()
|
||||
resp = self.client.post(self.url, data={"code": "000000"})
|
||||
self.assertDictEqual(resp.data, {"error": "error", "data": "Invalid code"})
|
||||
|
||||
def test_open_tfa_with_correct_code(self):
|
||||
self.test_get_image()
|
||||
code = self._get_tfa_code()
|
||||
resp = self.client.post(self.url, data={"code": code})
|
||||
self.assertSuccess(resp)
|
||||
user = User.objects.first()
|
||||
self.assertEqual(user.two_factor_auth, True)
|
||||
|
||||
def test_close_tfa_with_invalid_code(self):
|
||||
self.test_open_tfa_with_correct_code()
|
||||
resp = self.client.post(self.url, data={"code": "000000"})
|
||||
self.assertDictEqual(resp.data, {"error": "error", "data": "Invalid code"})
|
||||
|
||||
def test_close_tfa_with_correct_code(self):
|
||||
self.test_open_tfa_with_correct_code()
|
||||
code = self._get_tfa_code()
|
||||
resp = self.client.put(self.url, data={"code": code})
|
||||
self.assertSuccess(resp)
|
||||
user = User.objects.first()
|
||||
self.assertEqual(user.two_factor_auth, False)
|
||||
|
||||
|
||||
@mock.patch("account.views.oj.send_email_async.send")
|
||||
class ApplyResetPasswordAPITest(CaptchaTest):
|
||||
def setUp(self):
|
||||
self.create_user("test", "test123", login=False)
|
||||
user = User.objects.first()
|
||||
user.email = "test@oj.com"
|
||||
user.save()
|
||||
self.url = self.reverse("apply_reset_password_api")
|
||||
self.data = {"email": "test@oj.com", "captcha": self._set_captcha(self.client.session)}
|
||||
|
||||
def _refresh_captcha(self):
|
||||
self.data["captcha"] = self._set_captcha(self.client.session)
|
||||
|
||||
def test_apply_reset_password(self, send_email_send):
|
||||
resp = self.client.post(self.url, data=self.data)
|
||||
self.assertSuccess(resp)
|
||||
send_email_send.assert_called()
|
||||
|
||||
def test_apply_reset_password_twice_in_20_mins(self, send_email_send):
|
||||
self.test_apply_reset_password()
|
||||
send_email_send.reset_mock()
|
||||
self._refresh_captcha()
|
||||
resp = self.client.post(self.url, data=self.data)
|
||||
self.assertDictEqual(resp.data, {"error": "error", "data": "You can only reset password once per 20 minutes"})
|
||||
send_email_send.assert_not_called()
|
||||
|
||||
def test_apply_reset_password_again_after_20_mins(self, send_email_send):
|
||||
self.test_apply_reset_password()
|
||||
user = User.objects.first()
|
||||
user.reset_password_token_expire_time = now() - timedelta(minutes=21)
|
||||
user.save()
|
||||
self._refresh_captcha()
|
||||
self.test_apply_reset_password()
|
||||
|
||||
|
||||
class ResetPasswordAPITest(CaptchaTest):
|
||||
def setUp(self):
|
||||
self.create_user("test", "test123", login=False)
|
||||
self.url = self.reverse("reset_password_api")
|
||||
user = User.objects.first()
|
||||
user.reset_password_token = "online_judge?"
|
||||
user.reset_password_token_expire_time = now() + timedelta(minutes=20)
|
||||
user.save()
|
||||
self.data = {"token": user.reset_password_token,
|
||||
"captcha": self._set_captcha(self.client.session),
|
||||
"password": "test456"}
|
||||
|
||||
def test_reset_password_with_correct_token(self):
|
||||
resp = self.client.post(self.url, data=self.data)
|
||||
self.assertSuccess(resp)
|
||||
self.assertTrue(self.client.login(username="test", password="test456"))
|
||||
|
||||
def test_reset_password_with_invalid_token(self):
|
||||
self.data["token"] = "aaaaaaaaaaa"
|
||||
resp = self.client.post(self.url, data=self.data)
|
||||
self.assertDictEqual(resp.data, {"error": "error", "data": "Token does not exist"})
|
||||
|
||||
def test_reset_password_with_expired_token(self):
|
||||
user = User.objects.first()
|
||||
user.reset_password_token_expire_time = now() - timedelta(seconds=30)
|
||||
user.save()
|
||||
resp = self.client.post(self.url, data=self.data)
|
||||
self.assertDictEqual(resp.data, {"error": "error", "data": "Token has expired"})
|
||||
|
||||
|
||||
class UserChangeEmailAPITest(APITestCase):
|
||||
def setUp(self):
|
||||
self.url = self.reverse("user_change_email_api")
|
||||
self.user = self.create_user("test", "test123")
|
||||
self.new_mail = "test@oj.com"
|
||||
self.data = {"password": "test123", "new_email": self.new_mail}
|
||||
|
||||
def test_change_email_success(self):
|
||||
resp = self.client.post(self.url, data=self.data)
|
||||
self.assertSuccess(resp)
|
||||
|
||||
def test_wrong_password(self):
|
||||
self.data["password"] = "aaaa"
|
||||
resp = self.client.post(self.url, data=self.data)
|
||||
self.assertDictEqual(resp.data, {"error": "error", "data": "Wrong password"})
|
||||
|
||||
def test_duplicate_email(self):
|
||||
u = self.create_user("aa", "bb", login=False)
|
||||
u.email = self.new_mail
|
||||
u.save()
|
||||
resp = self.client.post(self.url, data=self.data)
|
||||
self.assertDictEqual(resp.data, {"error": "error", "data": "The email is owned by other account"})
|
||||
|
||||
|
||||
class UserChangePasswordAPITest(APITestCase):
|
||||
def setUp(self):
|
||||
self.url = self.reverse("user_change_password_api")
|
||||
|
||||
# Create user at first
|
||||
self.username = "test_user"
|
||||
self.old_password = "testuserpassword"
|
||||
self.new_password = "new_password"
|
||||
self.user = self.create_user(username=self.username, password=self.old_password, login=False)
|
||||
|
||||
self.data = {"old_password": self.old_password, "new_password": self.new_password}
|
||||
|
||||
def _get_tfa_code(self):
|
||||
user = User.objects.first()
|
||||
code = OtpAuth(user.tfa_token).totp()
|
||||
if len(str(code)) < 6:
|
||||
code = (6 - len(str(code))) * "0" + str(code)
|
||||
return code
|
||||
|
||||
    def test_login_required(self):
        response = self.client.post(self.url, data=self.data)
        self.assertEqual(response.data, {"error": "permission-denied", "data": "Please login first"})

    def test_valid_old_password(self):
        self.assertTrue(self.client.login(username=self.username, password=self.old_password))
        response = self.client.post(self.url, data=self.data)
        self.assertEqual(response.data, {"error": None, "data": "Succeeded"})
        self.assertTrue(self.client.login(username=self.username, password=self.new_password))

    def test_invalid_old_password(self):
        self.assertTrue(self.client.login(username=self.username, password=self.old_password))
        self.data["old_password"] = "invalid"
        response = self.client.post(self.url, data=self.data)
        self.assertEqual(response.data, {"error": "error", "data": "Invalid old password"})

    def test_tfa_code_required(self):
        self.user.two_factor_auth = True
        self.user.tfa_token = "tfa_token"
        self.user.save()
        self.assertTrue(self.client.login(username=self.username, password=self.old_password))
        self.data["tfa_code"] = rand_str(6)
        resp = self.client.post(self.url, data=self.data)
        self.assertEqual(resp.data, {"error": "error", "data": "Invalid two factor verification code"})

        self.data["tfa_code"] = self._get_tfa_code()
        resp = self.client.post(self.url, data=self.data)
        self.assertSuccess(resp)

class UserRankAPITest(APITestCase):
    def setUp(self):
        self.url = self.reverse("user_rank_api")
        self.create_user("test1", "test123", login=False)
        self.create_user("test2", "test123", login=False)
        test1 = User.objects.get(username="test1")
        profile1 = test1.userprofile
        profile1.submission_number = 10
        profile1.accepted_number = 10
        profile1.total_score = 240
        profile1.save()

        test2 = User.objects.get(username="test2")
        profile2 = test2.userprofile
        profile2.submission_number = 15
        profile2.accepted_number = 10
        profile2.total_score = 700
        profile2.save()

    def test_get_acm_rank(self):
        resp = self.client.get(self.url, data={"rule": ContestRuleType.ACM})
        self.assertSuccess(resp)
        data = resp.data["data"]["results"]
        self.assertEqual(data[0]["user"]["username"], "test1")
        self.assertEqual(data[1]["user"]["username"], "test2")

    def test_get_oi_rank(self):
        resp = self.client.get(self.url, data={"rule": ContestRuleType.OI})
        self.assertSuccess(resp)
        data = resp.data["data"]["results"]
        self.assertEqual(data[0]["user"]["username"], "test2")
        self.assertEqual(data[1]["user"]["username"], "test1")

    def test_admin_role_filted(self):
        self.create_admin("admin", "admin123")
        admin = User.objects.get(username="admin")
        profile1 = admin.userprofile
        profile1.submission_number = 20
        profile1.accepted_number = 5
        profile1.total_score = 300
        profile1.save()
        resp = self.client.get(self.url, data={"rule": ContestRuleType.ACM})
        self.assertSuccess(resp)
        self.assertEqual(len(resp.data["data"]), 2)

        resp = self.client.get(self.url, data={"rule": ContestRuleType.OI})
        self.assertSuccess(resp)
        self.assertEqual(len(resp.data["data"]), 2)
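The ordering these tests assert can be sketched in plain Python. The semantics below are an assumption inferred from the fixtures (ACM ranks by accepted count with submission count as tie-breaker, OI by total score); the real ordering logic lives in `UserRankAPI`:

```python
def rank_usernames(profiles, rule):
    """Return usernames ordered by the given contest rule (illustrative only)."""
    if rule == "ACM":
        # more accepted problems first; fewer submissions breaks ties
        key = lambda p: (-p["accepted_number"], p["submission_number"])
    else:  # "OI"
        key = lambda p: -p["total_score"]
    return [p["username"] for p in sorted(profiles, key=key)]

# the same figures as in setUp above
profiles = [
    {"username": "test1", "submission_number": 10, "accepted_number": 10, "total_score": 240},
    {"username": "test2", "submission_number": 15, "accepted_number": 10, "total_score": 700},
]
```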


class ProfileProblemDisplayIDRefreshAPITest(APITestCase):
    def setUp(self):
        pass

class AdminUserTest(APITestCase):
    def setUp(self):
        self.user = self.create_super_admin(login=True)
        self.username = self.password = "test"
        self.regular_user = self.create_user(username=self.username, password=self.password, login=False)
        self.url = self.reverse("user_admin_api")
        self.data = {"id": self.regular_user.id, "username": self.username, "real_name": "test_name",
                     "email": "test@qq.com", "admin_type": AdminType.REGULAR_USER,
                     "problem_permission": ProblemPermission.OWN, "open_api": True,
                     "two_factor_auth": False, "is_disabled": False}

    def test_user_list(self):
        response = self.client.get(self.url)
        self.assertSuccess(response)

    def test_edit_user_successfully(self):
        response = self.client.put(self.url, data=self.data)
        self.assertSuccess(response)
        resp_data = response.data["data"]
        self.assertEqual(resp_data["username"], self.username)
        self.assertEqual(resp_data["email"], "test@qq.com")
        self.assertEqual(resp_data["open_api"], True)
        self.assertEqual(resp_data["two_factor_auth"], False)
        self.assertEqual(resp_data["is_disabled"], False)
        self.assertEqual(resp_data["problem_permission"], ProblemPermission.NONE)

        self.assertTrue(self.regular_user.check_password("test"))

    def test_edit_user_password(self):
        data = self.data
        new_password = "testpassword"
        data["password"] = new_password
        response = self.client.put(self.url, data=data)
        self.assertSuccess(response)
        user = User.objects.get(id=self.regular_user.id)
        self.assertFalse(user.check_password(self.password))
        self.assertTrue(user.check_password(new_password))

    def test_edit_user_tfa(self):
        data = self.data
        self.assertIsNone(self.regular_user.tfa_token)
        data["two_factor_auth"] = True
        response = self.client.put(self.url, data=data)
        self.assertSuccess(response)
        resp_data = response.data["data"]
        # if `tfa_token` is None, a new value will be generated
        self.assertTrue(resp_data["two_factor_auth"])
        token = User.objects.get(id=self.regular_user.id).tfa_token
        self.assertIsNotNone(token)

        response = self.client.put(self.url, data=data)
        self.assertSuccess(response)
        resp_data = response.data["data"]
        # if `tfa_token` is not None, the value is not changed
        self.assertTrue(resp_data["two_factor_auth"])
        self.assertEqual(User.objects.get(id=self.regular_user.id).tfa_token, token)

    def test_edit_user_openapi(self):
        data = self.data
        self.assertIsNone(self.regular_user.open_api_appkey)
        data["open_api"] = True
        response = self.client.put(self.url, data=data)
        self.assertSuccess(response)
        resp_data = response.data["data"]
        # if `open_api_appkey` is None, a new value will be generated
        self.assertTrue(resp_data["open_api"])
        key = User.objects.get(id=self.regular_user.id).open_api_appkey
        self.assertIsNotNone(key)

        response = self.client.put(self.url, data=data)
        self.assertSuccess(response)
        resp_data = response.data["data"]
        # if `open_api_appkey` is not None, the value is not changed
        self.assertTrue(resp_data["open_api"])
        self.assertEqual(User.objects.get(id=self.regular_user.id).open_api_appkey, key)

    def test_import_users(self):
        data = {"users": [["user1", "pass1", "eami1@e.com", "user1"],
                          ["user2", "pass3", "eamil3@e.com", "user2"]]
                }
        resp = self.client.post(self.url, data)
        self.assertSuccess(resp)
        # successfully created 2 users
        self.assertEqual(User.objects.all().count(), 4)

    def test_import_duplicate_user(self):
        data = {"users": [["user1", "pass1", "eami1@e.com", "user1"],
                          ["user1", "pass1", "eami1@e.com", "user1"]]
                }
        resp = self.client.post(self.url, data)
        self.assertFailed(resp, "DETAIL: Key (username)=(user1) already exists.")
        # no user is created
        self.assertEqual(User.objects.all().count(), 2)

    def test_delete_users(self):
        self.test_import_users()
        user_ids = User.objects.filter(username__in=["user1", "user2"]).values_list("id", flat=True)
        user_ids = ",".join([str(id) for id in user_ids])
        resp = self.client.delete(self.url + "?id=" + user_ids)
        self.assertSuccess(resp)
        self.assertEqual(User.objects.all().count(), 2)

class GenerateUserAPITest(APITestCase):
    def setUp(self):
        self.create_super_admin()
        self.url = self.reverse("generate_user_api")
        self.data = {
            "number_from": 100, "number_to": 105,
            "prefix": "pre", "suffix": "suf",
            "default_email": "test@test.com",
            "password_length": 8
        }

    def test_error_case(self):
        data = deepcopy(self.data)
        data["prefix"] = "t" * 16
        data["suffix"] = "s" * 14
        resp = self.client.post(self.url, data=data)
        self.assertEqual(resp.data["data"], "Username should not more than 32 characters")

        data2 = deepcopy(self.data)
        data2["number_from"] = 106
        resp = self.client.post(self.url, data=data2)
        self.assertEqual(resp.data["data"], "Start number must be lower than end number")

    @mock.patch("account.views.admin.xlsxwriter.Workbook")
    def test_generate_user_success(self, mock_workbook):
        resp = self.client.post(self.url, data=self.data)
        self.assertSuccess(resp)
        mock_workbook.assert_called()

class OpenAPIAppkeyAPITest(APITestCase):
    def setUp(self):
        self.user = self.create_super_admin()
        self.url = self.reverse("open_api_appkey_api")

    def test_reset_appkey(self):
        resp = self.client.post(self.url, data={})
        self.assertFailed(resp)

        self.user.open_api = True
        self.user.save()
        resp = self.client.post(self.url, data={})
        self.assertSuccess(resp)
        self.assertEqual(resp.data["data"]["appkey"], User.objects.get(username=self.user.username).open_api_appkey)

@@ -1,9 +1,8 @@
from django.urls import path

from ..views.admin import GenerateUserAPI, ResetUserPasswordAPI, UserAdminAPI
from ..views.admin import UserAdminAPI, GenerateUserAPI

urlpatterns = [
    path("user", UserAdminAPI.as_view()),
    path("generate_user", GenerateUserAPI.as_view()),
    path("reset_password", ResetUserPasswordAPI.as_view()),
]

@@ -1,30 +1,29 @@
from django.urls import path

from utils.captcha.views import CaptchaAPIView

from ..views.oj import (
    SSOAPI,
    ApplyResetPasswordAPI,
    AvatarUploadAPI,
    CheckTFARequiredAPI,
    Metrics,
    OpenAPIAppkeyAPI,
    ProfileProblemDisplayIDRefreshAPI,
    ResetPasswordAPI,
    SessionManagementAPI,
    TwoFactorAuthAPI,
    UserActivityRankAPI,
    UserChangeEmailAPI,
    UserChangePasswordAPI,
    Metrics,
    UserRegisterAPI,
    UserChangeEmailAPI,
    UserLoginAPI,
    UserLogoutAPI,
    UsernameOrEmailCheck,
    UserProblemRankAPI,
    AvatarUploadAPI,
    TwoFactorAuthAPI,
    UserProfileAPI,
    UserRankAPI,
    UserRegisterAPI,
    UserActivityRankAPI,
    CheckTFARequiredAPI,
    SessionManagementAPI,
    ProfileProblemDisplayIDRefreshAPI,
    OpenAPIAppkeyAPI,
    SSOAPI,
)

from utils.captcha.views import CaptchaAPIView

urlpatterns = [
    path("login", UserLoginAPI.as_view()),
    path("logout", UserLogoutAPI.as_view()),
@@ -46,7 +45,6 @@ urlpatterns = [
    ),
    path("user_rank", UserRankAPI.as_view()),
    path("user_activity_rank", UserActivityRankAPI.as_view()),
    path("user_problem_rank", UserProblemRankAPI.as_view()),
    path("sessions", SessionManagementAPI.as_view()),
    path(
        "open_api_appkey",

@@ -1,12 +1,11 @@
import os
import re

import xlsxwriter
from django.contrib.auth.hashers import make_password
from django.db import IntegrityError, transaction
from django.db.models import F, Q

from django.db import transaction, IntegrityError
from django.db.models import Q
from django.http import HttpResponse
from django.utils.crypto import get_random_string
from django.contrib.auth.hashers import make_password

from submission.models import Submission
from utils.api import APIView, validate_serializer
@@ -16,23 +15,10 @@ from ..decorators import super_admin_required
from ..models import AdminType, ProblemPermission, User, UserProfile
from ..serializers import (
    EditUserSerializer,
    GenerateUserSerializer,
    ImportUserSerializer,
    UserAdminSerializer,
    GenerateUserSerializer,
)


# "ks251XXX" or "ks2510XX" returns "251" or "2510"
# anything else returns None
def get_class_name(username):
    if username.startswith("ks"):
        result = re.search(r"ks\d+", username)
        if result:
            return result.group(0)[2:]
        else:
            return None
    else:
        return None
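For reference, the prefix extraction above behaves as follows (a standalone copy of `get_class_name` with sample inputs):

```python
import re

def get_class_name(username):
    # "ks251XXX" or "ks2510XX" -> "251" / "2510"; anything else -> None
    if username.startswith("ks"):
        match = re.search(r"ks\d+", username)
        if match:
            return match.group(0)[2:]
    return None
```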
from ..serializers import ImportUserSerializer


class UserAdminAPI(APIView):
@@ -54,7 +40,6 @@ class UserAdminAPI(APIView):
                        password=make_password(user_data[1]),
                        email=user_data[2],
                        raw_password=user_data[1],
                        class_name=get_class_name(user_data[0]),
                    )
                )

@@ -100,13 +85,12 @@ class UserAdminAPI(APIView):

        pre_username = user.username
        user.username = data["username"].lower()
        user.class_name = get_class_name(data["username"])
        user.email = data["email"].lower()
        user.admin_type = data["admin_type"]
        user.is_disabled = data["is_disabled"]

        if data["admin_type"] == AdminType.ADMIN:
            user.problem_permission = data["problem_permission"] or ProblemPermission.OWN
            user.problem_permission = data["problem_permission"]
        elif data["admin_type"] == AdminType.SUPER_ADMIN:
            user.problem_permission = ProblemPermission.ALL
        else:
@@ -154,21 +138,12 @@ class UserAdminAPI(APIView):
            return self.error("User does not exist")
        return self.success(UserAdminSerializer(user).data)

        # read the sort parameter
        order_by = request.GET.get("order_by", "")

        # choose the ordering based on it
        if order_by == "-last_login":
            # most recent login first, with None values placed last
            user = User.objects.all().order_by(F("last_login").desc(nulls_last=True))
        else:
            # default: newest accounts first
            user = User.objects.all().order_by("-create_time")
        user = User.objects.all().order_by("-create_time")

        type = request.GET.get("type", "")
        is_admin = request.GET.get("admin", "0")

        if type:
            user = user.filter(admin_type=type)
        if is_admin == "1":
            user = user.exclude(admin_type=AdminType.REGULAR_USER)

        keyword = request.GET.get("keyword", None)
        if keyword:
@@ -264,27 +239,3 @@ class GenerateUserAPI(APIView):
            # duplicate key value violates unique constraint "user_username_key"
            # DETAIL: Key (username)=(root11) already exists.
            return self.error(str(e).split("\n")[1])


class ResetUserPasswordAPI(APIView):
    @super_admin_required
    def post(self, request):
        """
        Reset a user's password to a random 6-digit number (zero excluded)
        """
        data = request.data
        user_id = data["id"]

        try:
            user = User.objects.get(id=user_id)
        except User.DoesNotExist:
            return self.error("User does not exist")

        # generate a random 6-digit password (zero excluded)
        new_password = get_random_string(6, allowed_chars="123456789")

        # set the new password
        user.set_password(new_password)
        user.save()

        return self.success(new_password)
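The endpoint above relies on Django's `get_random_string`; a dependency-free sketch of the same six-digit, zero-free generation with the stdlib `secrets` module (the function name is illustrative):

```python
import secrets

def random_reset_password(length: int = 6, alphabet: str = "123456789") -> str:
    # cryptographically random digits, zero excluded, as in allowed_chars above
    return "".join(secrets.choice(alphabet) for _ in range(length))
```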
@@ -2,62 +2,47 @@ import os
from datetime import timedelta
from importlib import import_module

import qrcode
from django.conf import settings
from django.contrib import auth
from django.core.cache import cache
from django.db.models import Count, Q
from django.template.loader import render_to_string
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.utils.timezone import now
from django.views.decorators.csrf import csrf_exempt, ensure_csrf_cookie
from django.views.decorators.csrf import ensure_csrf_cookie, csrf_exempt
from django.db.models import Count, Q
from django.utils import timezone

import qrcode
from otpauth import TOTP

from options.options import SysOptions
from problem.models import Problem
from submission.models import JudgeStatus, Submission
from utils.api import APIView, CSRFExemptAPIView, validate_serializer
from submission.models import Submission, JudgeStatus
from utils.constants import ContestRuleType
from options.options import SysOptions
from utils.api import APIView, validate_serializer, CSRFExemptAPIView
from utils.captcha import Captcha
from utils.constants import CacheKey, ContestRuleType
from utils.shortcuts import datetime2str, img2base64, rand_str

from utils.shortcuts import rand_str, img2base64, datetime2str
from ..decorators import login_required
from ..models import AdminType, User, UserProfile
from ..models import User, UserProfile, AdminType
from ..serializers import (
    ApplyResetPasswordSerializer,
    EditUserProfileSerializer,
    ImageUploadForm,
    RankInfoSerializer,
    ResetPasswordSerializer,
    SSOSerializer,
    TwoFactorAuthCodeSerializer,
    UserChangeEmailSerializer,
    UserChangePasswordSerializer,
    UserLoginSerializer,
    UsernameOrEmailCheckSerializer,
    UserProfileSerializer,
    UserRegisterSerializer,
    UsernameOrEmailCheckSerializer,
    RankInfoSerializer,
    UserChangeEmailSerializer,
    SSOSerializer,
)
from ..serializers import (
    TwoFactorAuthCodeSerializer,
    UserProfileSerializer,
    EditUserProfileSerializer,
    ImageUploadForm,
)
from ..tasks import send_email_async


def _totp(token):
    return TOTP(token.encode("utf-8"))


def _totp_uri(token, label, issuer):
    return _totp(token).to_uri(label, issuer)


def _valid_totp(token, code):
    try:
        code = int(code)
    except (TypeError, ValueError):
        return False
    return _totp(token).verify(code)


class UserProfileAPI(APIView):
    @method_decorator(ensure_csrf_cookie)
    def get(self, request, **kwargs):
@@ -158,7 +143,9 @@ class TwoFactorAuthAPI(APIView):

        label = f"{SysOptions.website_name_shortcut}:{user.username}"
        image = qrcode.make(
            _totp_uri(token, label, SysOptions.website_name.replace(" ", ""))
            TOTP(token).to_uri(
                "totp", label, SysOptions.website_name.replace(" ", "")
            )
        )
        return self.success(img2base64(image))

@@ -170,7 +157,7 @@ class TwoFactorAuthAPI(APIView):
        """
        code = request.data["code"]
        user = request.user
        if _valid_totp(user.tfa_token, code):
        if TOTP(user.tfa_token).verify(code):
            user.two_factor_auth = True
            user.save()
            return self.success("Succeeded")
@@ -184,7 +171,7 @@ class TwoFactorAuthAPI(APIView):
        user = request.user
        if not user.two_factor_auth:
            return self.error("2FA is already turned off")
        if _valid_totp(user.tfa_token, code):
        if TOTP(user.tfa_token).verify(code):
            user.two_factor_auth = False
            user.save()
            return self.success("Succeeded")
@@ -222,23 +209,15 @@ class UserLoginAPI(APIView):
            if user.is_disabled:
                return self.error("Your account has been disabled")
            if not user.two_factor_auth:
                prev_login = user.last_login
                auth.login(request, user)
                request.session["prev_login"] = (
                    datetime2str(prev_login) if prev_login else ""
                )
                return self.success("Succeeded")

            # `tfa_code` not in post data
            if user.two_factor_auth and "tfa_code" not in data:
                return self.error("tfa_required")

            if _valid_totp(user.tfa_token, data["tfa_code"]):
                prev_login = user.last_login
            if TOTP(user.tfa_token).verify(data["tfa_code"]):
                auth.login(request, user)
                request.session["prev_login"] = (
                    datetime2str(prev_login) if prev_login else ""
                )
                return self.success("Succeeded")
            else:
                return self.error("Invalid two factor verification code")
@@ -308,7 +287,7 @@ class UserChangeEmailAPI(APIView):
        if user.two_factor_auth:
            if "tfa_code" not in data:
                return self.error("tfa_required")
            if not _valid_totp(user.tfa_token, data["tfa_code"]):
            if not TOTP(user.tfa_token).verify(data["tfa_code"]):
                return self.error("Invalid two factor verification code")
        data["new_email"] = data["new_email"].lower()
        if User.objects.filter(email=data["new_email"]).exists():
@@ -334,7 +313,7 @@ class UserChangePasswordAPI(APIView):
        if user.two_factor_auth:
            if "tfa_code" not in data:
                return self.error("tfa_required")
            if not _valid_totp(user.tfa_token, data["tfa_code"]):
            if not TOTP(user.tfa_token).verify(data["tfa_code"]):
                return self.error("Invalid two factor verification code")
        user.set_password(data["new_password"])
        user.save()

@@ -455,9 +434,8 @@ class UserRankAPI(APIView):
        n = 0
        if rule_type not in ContestRuleType.choices():
            rule_type = ContestRuleType.ACM

        profiles = UserProfile.objects.filter(
            user__admin_type__in=[AdminType.REGULAR_USER, AdminType.ADMIN],
            user__admin_type=AdminType.REGULAR_USER,
            user__is_disabled=False,
            user__username__icontains=username,
        ).select_related("user")
@@ -477,79 +455,24 @@ class UserActivityRankAPI(APIView):
        start = request.GET.get("start")
        if not start:
            return self.error("start time is required")
        cache_key = f"{CacheKey.user_activity_rank}:{start}"
        cached = cache.get(cache_key)
        if cached is not None:
            return self.success(cached)

        hidden_names = User.objects.filter(
            Q(admin_type=AdminType.SUPER_ADMIN) | Q(is_disabled=True)
            Q(admin_type=AdminType.SUPER_ADMIN)
            | Q(admin_type=AdminType.ADMIN)
            | Q(is_disabled=True)
        ).values_list("username", flat=True)
        submissions = Submission.objects.filter(
            contest_id__isnull=True,
            create_time__gte=start,
            result=JudgeStatus.ACCEPTED,
        ).exclude(username__in=hidden_names)
        data = list(
            contest_id__isnull=True, create_time__gte=start, result=JudgeStatus.ACCEPTED
        )
        counts = (
            submissions.values("username")
            .annotate(count=Count("problem_id", distinct=True))
            .order_by("-count")[:10]
        )
        cache.set(cache_key, data, 600)
        return self.success(data)

class UserProblemRankAPI(APIView):
    def get(self, request):
        problem_id = request.GET.get("problem_id")
        user = request.user
        if not user.is_authenticated:
            return self.error("User is not authenticated")

        problem = Problem.objects.get(
            _id=problem_id, contest_id__isnull=True, visible=True
        )
        submissions = Submission.objects.filter(
            problem=problem, result=JudgeStatus.ACCEPTED
        )

        all_ac_count = submissions.values("user_id").distinct().count()

        class_name = user.class_name or ""
        class_ac_count = 0

        if class_name:
            users = User.objects.filter(
                class_name=user.class_name, is_disabled=False
            ).values_list("id", flat=True)
            user_ids = list(users)
            submissions = submissions.filter(user_id__in=user_ids)
            class_ac_count = submissions.values("user_id").distinct().count()

        my_submissions = submissions.filter(user_id=user.id)

        if len(my_submissions) == 0:
            return self.success(
                {
                    "class_name": class_name,
                    "rank": -1,
                    "class_ac_count": class_ac_count,
                    "all_ac_count": all_ac_count,
                }
            )

        my_first_submission = my_submissions.order_by("create_time").first()
        rank = submissions.filter(
            create_time__lte=my_first_submission.create_time
        ).count()
        return self.success(
            {
                "class_name": class_name,
                "rank": rank,
                "class_ac_count": class_ac_count,
                "all_ac_count": all_ac_count,
            }
            .order_by("-count")[: 10 + len(hidden_names)]
        )
        data = []
        for count in counts:
            if count["username"] not in hidden_names:
                data.append(count)
        return self.success(data[:10])


class ProfileProblemDisplayIDRefreshAPI(APIView):

@@ -1,6 +0,0 @@
from django.apps import AppConfig


class AiConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'ai'
@@ -1,34 +0,0 @@
# Generated by Django 5.2.3 on 2025-09-24 12:59

import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name='AIAnalysis',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('provider', models.TextField(default='deepseek')),
                ('data', models.JSONField()),
                ('system_prompt', models.TextField()),
                ('user_prompt', models.TextField()),
                ('analysis', models.TextField()),
                ('create_time', models.DateTimeField(auto_now_add=True)),
                ('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
            ],
            options={
                'db_table': 'ai_analysis',
                'ordering': ['-create_time'],
            },
        ),
    ]
@@ -1,18 +0,0 @@
# Generated by Django 5.2.3 on 2025-09-24 13:02

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('ai', '0001_initial'),
    ]

    operations = [
        migrations.AddField(
            model_name='aianalysis',
            name='model',
            field=models.TextField(default='deepseek-chat'),
        ),
    ]
@@ -1,18 +0,0 @@
# Generated by Django 6.0 on 2026-04-27 12:31

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('ai', '0002_aianalysis_model'),
    ]

    operations = [
        migrations.AlterField(
            model_name='aianalysis',
            name='model',
            field=models.TextField(default='deepseek-v4-flash'),
        ),
    ]
18 ai/models.py
@@ -1,18 +0,0 @@
from django.db import models

from account.models import User


class AIAnalysis(models.Model):
    user = models.ForeignKey(User, on_delete=models.CASCADE)
    provider = models.TextField(default="deepseek")
    model = models.TextField(default="deepseek-v4-flash")
    data = models.JSONField()
    system_prompt = models.TextField()
    user_prompt = models.TextField()
    analysis = models.TextField()
    create_time = models.DateTimeField(auto_now_add=True)

    class Meta:
        db_table = "ai_analysis"
        ordering = ["-create_time"]
@@ -1,19 +0,0 @@
from django.urls import path

from ..views.oj import (
    AIAnalysisAPI,
    AIDetailDataAPI,
    AIDurationDataAPI,
    AIHeatmapDataAPI,
    AIHintAPI,
    AILoginSummaryAPI,
)

urlpatterns = [
    path("ai/detail", AIDetailDataAPI.as_view()),
    path("ai/duration", AIDurationDataAPI.as_view()),
    path("ai/analysis", AIAnalysisAPI.as_view()),
    path("ai/hint", AIHintAPI.as_view()),
    path("ai/heatmap", AIHeatmapDataAPI.as_view()),
    path("ai/login_summary", AILoginSummaryAPI.as_view()),
]
711 ai/views/oj.py
@@ -1,711 +0,0 @@
import hashlib
import json
from collections import defaultdict
from datetime import datetime, timedelta

from dateutil.relativedelta import relativedelta
from django.core.cache import cache
from django.db.models import Count, Min
from django.db.models.functions import TruncDate
from django.http import StreamingHttpResponse
from django.utils import timezone
from django.utils.dateparse import parse_datetime

from account.decorators import login_required
from account.models import User
from ai.models import AIAnalysis
from flowchart.models import FlowchartSubmission, FlowchartSubmissionStatus
from problem.models import Problem
from submission.models import JudgeStatus, Submission
from utils.api import APIView
from utils.openai import get_ai_client
from utils.shortcuts import datetime2str

CACHE_TIMEOUT = 300
DIFFICULTY_MAP = {"Low": "简单", "Mid": "中等", "High": "困难"}
DEFAULT_CLASS_SIZE = 45

# Grade threshold configuration: (percentile upper bound, grade)
GRADE_THRESHOLDS = [
    (10, "S"),   # top 10%: S - outstanding
    (35, "A"),   # top 35%: A - excellent
    (75, "B"),   # top 75%: B - good
    (100, "C"),  # the rest: C - pass
]

# Small-participation penalty configuration: (minimum count, downgrade mapping)
SMALL_SCALE_PENALTY = {
    "threshold": 10,
    "downgrade": {"S": "A", "A": "B"},
}

# Grade weight mapping (used for the weighted average)
GRADE_WEIGHTS = {"S": 4, "A": 3, "B": 2, "C": 1}

# Average-grade thresholds: (minimum weight, grade)
AVERAGE_GRADE_THRESHOLDS = [(3.5, "S"), (2.5, "A"), (1.5, "B")]


def get_cache_key(prefix, *args):
    return hashlib.md5(f"{prefix}:{'_'.join(map(str, args))}".encode()).hexdigest()
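A quick standalone check of the key derivation above (same hashing; stdlib only):

```python
import hashlib

def get_cache_key(prefix, *args):
    # md5 over "prefix:arg1_arg2_..." gives a short, stable cache key
    return hashlib.md5(f"{prefix}:{'_'.join(map(str, args))}".encode()).hexdigest()
```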


def get_difficulty(difficulty):
    return DIFFICULTY_MAP.get(difficulty, "中等")


def get_grade(rank, submission_count):
    """
    Compute the completion grade for a problem.

    Grading scale:
    - S: top 10%, outstanding (10% of participants)
    - A: top 35%, excellent (25% of participants)
    - B: top 75%, good (40% of participants)
    - C: below 75%, pass (25% of participants)

    Special rule:
    - With fewer than 10 participants, S is downgraded to A and A to B
      (to avoid inflated grades in small groups).
    """
    if not rank or rank <= 0 or submission_count <= 0:
        return "C"

    percentile = (rank - 1) / submission_count * 100

    base_grade = "C"
    for threshold, grade in GRADE_THRESHOLDS:
        if percentile < threshold:
            base_grade = grade
            break

    if submission_count < SMALL_SCALE_PENALTY["threshold"]:
        base_grade = SMALL_SCALE_PENALTY["downgrade"].get(base_grade, base_grade)

    return base_grade
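Worked through the thresholds above, the grading behaves like this (a standalone copy of `get_grade` with its configuration):

```python
GRADE_THRESHOLDS = [(10, "S"), (35, "A"), (75, "B"), (100, "C")]
SMALL_SCALE_PENALTY = {"threshold": 10, "downgrade": {"S": "A", "A": "B"}}

def get_grade(rank, submission_count):
    # percentile of the user's rank among all finishers decides the grade
    if not rank or rank <= 0 or submission_count <= 0:
        return "C"
    percentile = (rank - 1) / submission_count * 100
    base_grade = "C"
    for threshold, grade in GRADE_THRESHOLDS:
        if percentile < threshold:
            base_grade = grade
            break
    # small groups: downgrade S->A and A->B to avoid inflated grades
    if submission_count < SMALL_SCALE_PENALTY["threshold"]:
        base_grade = SMALL_SCALE_PENALTY["downgrade"].get(base_grade, base_grade)
    return base_grade
```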


def calculate_average_grade(grades):
    """Compute the weighted average grade from a list of grades."""
    scores = [GRADE_WEIGHTS[g] for g in grades if g in GRADE_WEIGHTS]
    if not scores:
        return ""
    avg = sum(scores) / len(scores)
    for threshold, grade in AVERAGE_GRADE_THRESHOLDS:
        if avg >= threshold:
            return grade
    return "C"
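For the weighted average, the mapping above works out as follows (a standalone copy with its configuration):

```python
GRADE_WEIGHTS = {"S": 4, "A": 3, "B": 2, "C": 1}
AVERAGE_GRADE_THRESHOLDS = [(3.5, "S"), (2.5, "A"), (1.5, "B")]

def calculate_average_grade(grades):
    # average the numeric weights, then map back to a letter grade
    scores = [GRADE_WEIGHTS[g] for g in grades if g in GRADE_WEIGHTS]
    if not scores:
        return ""
    avg = sum(scores) / len(scores)
    for threshold, grade in AVERAGE_GRADE_THRESHOLDS:
        if avg >= threshold:
            return grade
    return "C"
```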


def find_user_rank(ranking_list, user_id):
    """Find the user's 1-based rank in the ranking list; return None if absent."""
    return next(
        (idx + 1 for idx, rec in enumerate(ranking_list) if rec["user_id"] == user_id),
        None,
    )
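Usage of the helper above (a standalone copy; the ranking records only need a `user_id` key):

```python
def find_user_rank(ranking_list, user_id):
    # 1-based position of the first record with a matching user_id, else None
    return next(
        (idx + 1 for idx, rec in enumerate(ranking_list) if rec["user_id"] == user_id),
        None,
    )

ranking = [{"user_id": 7}, {"user_id": 3}, {"user_id": 9}]
```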
|
||||
|
||||
def get_class_user_ids(user):
|
||||
if not user.class_name:
|
||||
return []
|
||||
|
||||
cache_key = get_cache_key("class_users", user.class_name)
|
||||
user_ids = cache.get(cache_key)
|
||||
if user_ids is None:
|
||||
user_ids = list(
|
||||
User.objects.filter(class_name=user.class_name).values_list("id", flat=True)
|
||||
)
|
||||
cache.set(cache_key, user_ids, CACHE_TIMEOUT)
|
||||
return user_ids
|
||||
|
||||
|
||||
def get_user_first_ac_submissions(
    user_id, start, end, class_user_ids=None, use_class_scope=False
):
    base_qs = Submission.objects.filter(
        result=JudgeStatus.ACCEPTED, create_time__gte=start, create_time__lte=end
    )
    if use_class_scope and class_user_ids:
        base_qs = base_qs.filter(user_id__in=class_user_ids)

    user_first_ac = list(
        base_qs.filter(user_id=user_id)
        .values("problem_id")
        .annotate(first_ac_time=Min("create_time"))
    )
    if not user_first_ac:
        return [], {}, []

    problem_ids = [item["problem_id"] for item in user_first_ac]
    ranked_first_ac = list(
        base_qs.filter(problem_id__in=problem_ids)
        .values("user_id", "problem_id")
        .annotate(first_ac_time=Min("create_time"))
    )

    by_problem = defaultdict(list)
    for item in ranked_first_ac:
        by_problem[item["problem_id"]].append(item)
    for submissions in by_problem.values():
        submissions.sort(key=lambda x: (x["first_ac_time"], x["user_id"]))

    return user_first_ac, by_problem, problem_ids

def stream_ai_response(client, system_prompt, user_prompt, on_complete=None):
    """SSE streaming response generator; on_complete(full_text) is called when the stream ends."""
    try:
        stream = client.chat.completions.create(
            model="deepseek-reasoner",
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": user_prompt},
            ],
            stream=True,
        )
    except Exception as exc:
        yield f"data: {json.dumps({'type': 'error', 'message': str(exc)})}\n\n"
        yield "event: end\n\n"
        return

    yield "event: start\n\n"
    chunks = []
    try:
        for chunk in stream:
            if not chunk.choices:
                continue
            choice = chunk.choices[0]
            if choice.finish_reason:
                if on_complete:
                    on_complete("".join(chunks).strip())
                yield f"data: {json.dumps({'type': 'done'})}\n\n"
                break
            content = choice.delta.content
            if content:
                chunks.append(content)
                yield f"data: {json.dumps({'type': 'delta', 'content': content})}\n\n"
    except Exception as exc:
        yield f"data: {json.dumps({'type': 'error', 'message': str(exc)})}\n\n"
    finally:
        yield "event: end\n\n"

def make_sse_response(generator):
    """Create an SSE StreamingHttpResponse."""
    response = StreamingHttpResponse(
        streaming_content=generator,
        content_type="text/event-stream",
    )
    response["Cache-Control"] = "no-cache"
    return response

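The generator above emits the standard SSE wire format: `event:`/`data:` lines, with a blank line terminating each event. A minimal sketch of how a client might split such a stream into events (illustrative only; real clients typically use `EventSource` or an SSE library):

```python
import json


def parse_sse(raw: str):
    """Split a raw SSE payload into (field, value) pairs."""
    events = []
    for block in raw.split("\n\n"):          # blank line ends an event
        for line in block.splitlines():
            field, _, value = line.partition(": ")
            if field in ("event", "data"):
                events.append((field, value))
    return events


raw = (
    "event: start\n\n"
    'data: {"type": "delta", "content": "hi"}\n\n'
    'data: {"type": "done"}\n\n'
    "event: end\n\n"
)
events = parse_sse(raw)
print(events[0])                            # ('event', 'start')
print(json.loads(events[1][1])["content"])  # hi
```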
class AIDetailDataAPI(APIView):
    @login_required
    def get(self, request):
        start = request.GET.get("start")
        end = request.GET.get("end")

        user = request.user

        cache_key = get_cache_key(
            "ai_detail", user.id, user.class_name or "", start, end
        )
        cached_result = cache.get(cache_key)
        if cached_result:
            return self.success(cached_result)

        class_user_ids = get_class_user_ids(user)
        use_class_scope = bool(user.class_name) and len(class_user_ids) > 1
        user_first_ac, by_problem, problem_ids = get_user_first_ac_submissions(
            user.id, start, end, class_user_ids, use_class_scope
        )

        result = {
            "user": user.username,
            "class_name": user.class_name,
            "start": start,
            "end": end,
            "solved": [],
            "flowcharts": [],
            "grade": "",
            "tags": {},
            "difficulty": {},
            "contest_count": 0,
        }

        if user_first_ac:
            problems = {
                p.id: p
                for p in Problem.objects.filter(id__in=problem_ids)
                .select_related("contest")
                .prefetch_related("tags")
            }
            solved, contest_ids = self._build_solved_records(
                user_first_ac, by_problem, problems, user.id
            )
            # Fetch flowchart submissions
            flowcharts_query = FlowchartSubmission.objects.filter(
                user_id=user,
                status=FlowchartSubmissionStatus.COMPLETED,
            )

            # Apply the time-range filter
            if start:
                flowcharts_query = flowcharts_query.filter(create_time__gte=start)
            if end:
                flowcharts_query = flowcharts_query.filter(create_time__lte=end)

            flowcharts = flowcharts_query.select_related("problem").only(
                "id",
                "create_time",
                "ai_score",
                "ai_grade",
                "problem___id",
                "problem__title",
            )

            # Group by problem
            problem_groups = defaultdict(list)
            for flowchart in flowcharts:
                problem_id = flowchart.problem._id
                problem_groups[problem_id].append(flowchart)

            flowcharts_data = []
            for problem_id, submissions in problem_groups.items():
                if not submissions:
                    continue

                # Basic info comes from the first submission
                first_submission = submissions[0]

                # Collect statistics
                scores = [s.ai_score for s in submissions if s.ai_score is not None]
                times = [s.create_time for s in submissions]

                # Find the best score and its grade
                best_score = max(scores) if scores else 0
                best_submission = next(
                    (s for s in submissions if s.ai_score == best_score), submissions[0]
                )
                best_grade = best_submission.ai_grade or ""

                # Average score
                avg_score = sum(scores) / len(scores) if scores else 0

                # Latest submission time
                latest_time = max(times) if times else first_submission.create_time

                merged_item = {
                    "problem__id": problem_id,
                    "problem_title": first_submission.problem.title,
                    "submission_count": len(submissions),
                    "best_score": best_score,
                    "best_grade": best_grade,
                    "latest_submission_time": latest_time.isoformat() if latest_time else None,
                    "avg_score": round(avg_score, 0),
                }

                flowcharts_data.append(merged_item)

            # Sort by latest submission time
            flowcharts_data.sort(
                key=lambda x: x["latest_submission_time"] or "", reverse=True
            )

            result.update(
                {
                    "solved": solved,
                    "flowcharts": flowcharts_data,
                    "grade": calculate_average_grade([s["grade"] for s in solved]),
                    "tags": self._calculate_top_tags(problems.values()),
                    "difficulty": self._calculate_difficulty_distribution(
                        problems.values()
                    ),
                    "contest_count": len(set(contest_ids)),
                }
            )

        cache.set(cache_key, result, CACHE_TIMEOUT)
        return self.success(result)

    def _build_solved_records(self, user_first_ac, by_problem, problems, user_id):
        solved, contest_ids = [], []
        for item in user_first_ac:
            pid = item["problem_id"]
            problem = problems.get(pid)
            if not problem:
                continue

            ranking_list = by_problem.get(pid, [])
            rank = find_user_rank(ranking_list, user_id)

            if problem.contest_id:
                contest_ids.append(problem.contest_id)

            solved.append(
                {
                    "problem": {
                        "display_id": problem._id,
                        "title": problem.title,
                        "contest_id": problem.contest_id,
                        "contest_title": getattr(problem.contest, "title", ""),
                    },
                    "ac_time": timezone.localtime(item["first_ac_time"]).isoformat(),
                    "rank": rank,
                    "ac_count": len(ranking_list),
                    "grade": get_grade(rank, len(ranking_list)),
                    "difficulty": get_difficulty(problem.difficulty),
                }
            )

        return sorted(solved, key=lambda x: x["ac_time"]), contest_ids

    def _calculate_top_tags(self, problems):
        tags_counter = defaultdict(int)
        for problem in problems:
            for tag in problem.tags.all():
                if tag.name:
                    tags_counter[tag.name] += 1
        return dict(sorted(tags_counter.items(), key=lambda x: x[1], reverse=True)[:5])

    def _calculate_difficulty_distribution(self, problems):
        diff_counter = {"Low": 0, "Mid": 0, "High": 0}
        for problem in problems:
            diff_counter[
                problem.difficulty if problem.difficulty in diff_counter else "Mid"
            ] += 1
        return {
            get_difficulty(k): v
            for k, v in sorted(diff_counter.items(), key=lambda x: x[1], reverse=True)
        }

class AIDurationDataAPI(APIView):
    @login_required
    def get(self, request):
        end_iso = request.GET.get("end")
        duration = request.GET.get("duration")

        user = request.user

        cache_key = get_cache_key(
            "ai_duration", user.id, user.class_name or "", end_iso, duration
        )
        cached_result = cache.get(cache_key)
        if cached_result:
            return self.success(cached_result)

        class_user_ids = get_class_user_ids(user)
        use_class_scope = bool(user.class_name) and len(class_user_ids) > 1
        time_config = self._parse_duration(duration)
        start = datetime.fromisoformat(end_iso) - time_config["total_delta"]

        duration_data = []
        for i in range(time_config["show_count"]):
            start = start + time_config["delta"]
            period_end = start + time_config["delta"]

            submission_count = Submission.objects.filter(
                user_id=user.id, create_time__gte=start, create_time__lte=period_end
            ).count()

            period_data = {
                "unit": time_config["show_unit"],
                "index": time_config["show_count"] - 1 - i,
                "start": start.isoformat(),
                "end": period_end.isoformat(),
                "problem_count": 0,
                "submission_count": submission_count,
                "grade": "",
            }

            if submission_count > 0:
                user_first_ac, by_problem, problem_ids = get_user_first_ac_submissions(
                    user.id,
                    start.isoformat(),
                    period_end.isoformat(),
                    class_user_ids,
                    use_class_scope,
                )
                if user_first_ac:
                    period_data["problem_count"] = len(problem_ids)
                    grades = [
                        get_grade(
                            find_user_rank(by_problem.get(item["problem_id"], []), user.id),
                            len(by_problem.get(item["problem_id"], [])),
                        )
                        for item in user_first_ac
                    ]
                    period_data["grade"] = calculate_average_grade(grades)

            duration_data.append(period_data)

        cache.set(cache_key, duration_data, CACHE_TIMEOUT)
        return self.success(duration_data)

    def _parse_duration(self, duration):
        unit, count = duration.split(":")
        count = int(count)

        configs = {
            ("months", 2): {
                "show_count": 8,
                "show_unit": "weeks",
                "total_delta": timedelta(weeks=9),
                "delta": timedelta(weeks=1),
            },
            ("months", 6): {
                "show_count": 6,
                "show_unit": "months",
                "total_delta": relativedelta(months=7),
                "delta": relativedelta(months=1),
            },
            ("years", 1): {
                "show_count": 12,
                "show_unit": "months",
                "total_delta": relativedelta(months=13),
                "delta": relativedelta(months=1),
            },
        }

        return configs.get(
            (unit, count),
            {
                "show_count": 4,
                "show_unit": "weeks",
                "total_delta": timedelta(weeks=5),
                "delta": timedelta(weeks=1),
            },
        )

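The duration lookup falls back to a 4-week window for unrecognized `unit:count` pairs. A trimmed standalone sketch (config values copied from the method; the `relativedelta` cases are omitted for brevity):

```python
from datetime import timedelta

# One real config entry plus the fallback, as in the method above.
CONFIGS = {
    ("months", 2): {"show_count": 8, "show_unit": "weeks",
                    "total_delta": timedelta(weeks=9), "delta": timedelta(weeks=1)},
}
DEFAULT = {"show_count": 4, "show_unit": "weeks",
           "total_delta": timedelta(weeks=5), "delta": timedelta(weeks=1)}


def parse_duration(duration: str) -> dict:
    # "months:2" -> ("months", 2); unknown pairs fall back to DEFAULT.
    unit, count = duration.split(":")
    return CONFIGS.get((unit, int(count)), DEFAULT)


print(parse_duration("months:2")["show_count"])  # 8
print(parse_duration("weeks:1")["show_count"])   # 4 (fallback)
```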
class AILoginSummaryAPI(APIView):
    @login_required
    def get(self, request):
        user = request.user
        end_time = timezone.now()
        start_time = self._resolve_start_time(request, user, end_time)

        problems_qs = Problem.objects.filter(
            create_time__gte=start_time,
            create_time__lte=end_time,
            contest_id__isnull=True,
            visible=True,
        )
        new_problem_count = problems_qs.count()

        submissions_qs = Submission.objects.filter(
            user_id=user.id, create_time__gte=start_time, create_time__lte=end_time
        )
        submission_count = submissions_qs.count()
        accepted_count = submissions_qs.filter(result=JudgeStatus.ACCEPTED).count()
        solved_count = (
            submissions_qs.filter(result=JudgeStatus.ACCEPTED)
            .values("problem_id")
            .distinct()
            .count()
        )
        flowchart_submission_count = FlowchartSubmission.objects.filter(
            user_id=user.id, create_time__gte=start_time, create_time__lte=end_time
        ).count()

        summary = {
            "start": datetime2str(start_time),
            "end": datetime2str(end_time),
            "new_problem_count": new_problem_count,
            "submission_count": submission_count,
            "accepted_count": accepted_count,
            "solved_count": solved_count,
            "flowchart_submission_count": flowchart_submission_count,
        }

        analysis = ""
        analysis_error = ""
        if submission_count >= 3:
            analysis, analysis_error = self._get_ai_analysis(summary)

        data = {"summary": summary, "analysis": analysis}
        if analysis_error:
            data["analysis_error"] = analysis_error
        return self.success(data)

    def _resolve_start_time(self, request, user, end_time):
        start_raw = request.session.get("prev_login") or request.GET.get("start")
        start_time = parse_datetime(start_raw) if start_raw else None

        if start_time and timezone.is_naive(start_time):
            start_time = timezone.make_aware(
                start_time, timezone.get_current_timezone()
            )

        if not start_time:
            if user.last_login and user.last_login < end_time:
                start_time = user.last_login
            elif user.create_time:
                start_time = user.create_time
            else:
                start_time = end_time - timedelta(days=7)

        if start_time >= end_time:
            start_time = end_time - timedelta(days=1)

        return start_time

    def _get_ai_analysis(self, summary):
        try:
            client = get_ai_client()
        except Exception as exc:
            return "", str(exc)

        system_prompt = (
            "你是 OnlineJudge 的学习助教。"
            "请根据统计数据给出简短分析(1-2句),再给出一行结论,"
            "结论用“结论:”开头。"
        )
        user_prompt = (
            f"时间范围:{summary['start']} 到 {summary['end']}\n"
            f"新题目数:{summary['new_problem_count']}\n"
            f"提交次数:{summary['submission_count']}\n"
            f"AC 次数:{summary['accepted_count']}\n"
            f"AC 题目数:{summary['solved_count']}\n"
            f"流程图提交数:{summary['flowchart_submission_count']}\n"
        )

        try:
            completion = client.chat.completions.create(
                model="deepseek-reasoner",
                messages=[
                    {"role": "system", "content": system_prompt},
                    {"role": "user", "content": user_prompt},
                ],
            )
        except Exception as exc:
            return "", str(exc)

        if not completion.choices:
            return "", ""

        content = completion.choices[0].message.content or ""
        return content.strip(), ""

class AIAnalysisAPI(APIView):
    @login_required
    def post(self, request):
        details = request.data.get("details")
        duration = request.data.get("duration")

        client = get_ai_client()

        system_prompt = (
            "你是一个风趣的编程老师,学生使用判题狗平台进行编程练习。"
            "请根据学生提供的详细数据和每周数据,给出用户的学习建议,最后写一句鼓励学生的话。"
            "请使用 markdown 格式输出,不要在代码块中输出。"
        )
        user_prompt = f"这段时间内的详细数据: {details}\n(其中部分字段含义是 flowcharts:流程图的提交,solved:代码的提交)\n每周或每月的数据: {duration}"

        def on_complete(full_text):
            AIAnalysis.objects.create(
                user=request.user,
                provider="deepseek",
                model="deepseek-reasoner",
                data={"details": details, "duration": duration},
                system_prompt=system_prompt,
                user_prompt="这段时间内的详细数据,每周或每月的数据。",
                analysis=full_text,
            )

        return make_sse_response(
            stream_ai_response(client, system_prompt, user_prompt, on_complete)
        )

class AIHintAPI(APIView):
    @login_required
    def post(self, request):
        submission_id = request.data.get("submission_id")
        if not submission_id:
            return self.error("submission_id is required")

        try:
            submission = Submission.objects.get(id=submission_id, user_id=request.user.id)
        except Submission.DoesNotExist:
            return self.error("Submission not found")

        problem = submission.problem
        client = get_ai_client()

        # Pick a reference answer (same language preferred, otherwise the first one)
        answers = problem.answers or []
        ref_answer = next(
            (a["code"] for a in answers if a["language"] == submission.language),
            answers[0]["code"] if answers else "",
        )

        system_prompt = (
            "你是编程助教。你知道题目的参考答案,但【绝对禁止】把参考答案或其中任何代码"
            "直接告诉学生,也不能以任何形式暗示完整解法。"
            "你的任务是:对照参考答案,找出学生代码中的问题,"
            "给出方向性提示(例如:指出哪类边界情况需要考虑、"
            "哪个算法思路更合适、哪行代码逻辑可能有问题等)。"
            "语气鼓励,回复简洁(3-5句话),使用 Markdown 格式。"
        )
        user_prompt = (
            f"题目:{problem.title}\n"
            f"题目描述:{problem.description[:500]}\n"
            f"参考答案(仅供你分析,不可透露给学生):\n```\n{ref_answer[:2000]}\n```\n"
            f"学生提交语言:{submission.language}\n"
            f"判题结果:{submission.result}\n"
            f"错误信息:{submission.statistic_info.get('err_info', '无')}\n"
            f"学生代码:\n```\n{submission.code[:2000]}\n```"
        )

        return make_sse_response(
            stream_ai_response(client, system_prompt, user_prompt)
        )

class AIHeatmapDataAPI(APIView):
    @login_required
    def get(self, request):
        user = request.user
        cache_key = get_cache_key("ai_heatmap", user.id, user.class_name or "")
        cached_result = cache.get(cache_key)
        if cached_result:
            return self.success(cached_result)

        end = datetime.now()
        start = end - timedelta(days=365)

        # Fetch everything in a single query, grouped and counted by date
        submission_counts = (
            Submission.objects.filter(
                user_id=user.id, create_time__gte=start, create_time__lte=end
            )
            .annotate(date=TruncDate("create_time"))
            .values("date")
            .annotate(count=Count("id"))
            .order_by("date")
        )

        # Convert the query result into a dict for fast lookup
        submission_dict = {item["date"]: item["count"] for item in submission_counts}

        # Build 365 days of heatmap data
        heatmap_data = []
        current_date = start.date()
        for i in range(365):
            day_date = current_date + timedelta(days=i)
            submission_count = submission_dict.get(day_date, 0)
            heatmap_data.append(
                {
                    "timestamp": int(
                        datetime.combine(day_date, datetime.min.time()).timestamp()
                        * 1000
                    ),
                    "value": submission_count,
                }
            )

        cache.set(cache_key, heatmap_data, CACHE_TIMEOUT)
        return self.success(heatmap_data)
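The per-day gap filling above (dict lookup with a default of 0 for days without submissions) can be illustrated in isolation; the dates and counts below are made up, and only 4 days are generated instead of 365:

```python
from datetime import date, datetime, timedelta

# Hypothetical query result: only days with submissions appear.
submission_dict = {date(2024, 1, 1): 3, date(2024, 1, 3): 1}

start = date(2024, 1, 1)
heatmap = []
for i in range(4):  # 4 days instead of 365 for brevity
    day = start + timedelta(days=i)
    heatmap.append(
        {
            # Milliseconds since epoch at local midnight, as in the API above.
            "timestamp": int(datetime.combine(day, datetime.min.time()).timestamp() * 1000),
            "value": submission_dict.get(day, 0),  # missing days become 0
        }
    )

print([p["value"] for p in heatmap])  # [3, 0, 1, 0]
```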
@@ -1,19 +0,0 @@
# Generated by Django 6.0 on 2026-04-23 20:07

from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('announcement', '0001_initial'),
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.AddIndex(
            model_name='announcement',
            index=models.Index(fields=['visible', '-top', '-create_time'], name='announcement_list_idx'),
        ),
    ]
@@ -18,6 +18,3 @@ class Announcement(models.Model):
    class Meta:
        db_table = "announcement"
        ordering = ("-top", "-create_time",)
        indexes = [
            models.Index(fields=["visible", "-top", "-create_time"], name="announcement_list_idx"),
        ]

48 announcement/tests.py Normal file
@@ -0,0 +1,48 @@
from utils.api.tests import APITestCase

from .models import Announcement


class AnnouncementAdminTest(APITestCase):
    def setUp(self):
        self.user = self.create_super_admin()
        self.url = self.reverse("announcement_admin_api")

    def test_announcement_list(self):
        response = self.client.get(self.url)
        self.assertSuccess(response)

    def create_announcement(self):
        return self.client.post(self.url, data={"title": "test", "content": "test", "visible": True})

    def test_create_announcement(self):
        resp = self.create_announcement()
        self.assertSuccess(resp)
        return resp

    def test_edit_announcement(self):
        data = {"id": self.create_announcement().data["data"]["id"], "title": "ahaha", "content": "test content",
                "visible": False}
        resp = self.client.put(self.url, data=data)
        self.assertSuccess(resp)
        resp_data = resp.data["data"]
        self.assertEqual(resp_data["title"], "ahaha")
        self.assertEqual(resp_data["content"], "test content")
        self.assertEqual(resp_data["visible"], False)

    def test_delete_announcement(self):
        id = self.test_create_announcement().data["data"]["id"]
        resp = self.client.delete(self.url + "?id=" + str(id))
        self.assertSuccess(resp)
        self.assertFalse(Announcement.objects.filter(id=id).exists())


class AnnouncementAPITest(APITestCase):
    def setUp(self):
        self.user = self.create_super_admin()
        Announcement.objects.create(title="title", content="content", visible=True, created_by=self.user)
        self.url = self.reverse("announcement_api")

    def test_get_announcement_list(self):
        resp = self.client.get(self.url)
        self.assertSuccess(resp)
@@ -1,11 +1,12 @@
from account.decorators import super_admin_required
from utils.api import APIView, validate_serializer

from announcement.models import Announcement
from announcement.serializers import (
    AnnouncementSerializer,
    CreateAnnouncementSerializer,
    EditAnnouncementSerializer,
)
from utils.api import APIView, validate_serializer


class AnnouncementAdminAPI(APIView):
@@ -1,7 +1,8 @@
from announcement.models import Announcement
from announcement.serializers import AnnouncementListSerializer, AnnouncementSerializer
from utils.api import APIView

from announcement.models import Announcement
from announcement.serializers import AnnouncementSerializer, AnnouncementListSerializer


class AnnouncementAPI(APIView):
    def get(self, request):
@@ -13,7 +14,7 @@ class AnnouncementAPI(APIView):
        except Announcement.DoesNotExist:
            return self.error("Announcement does not exist")

        announcements = Announcement.objects.select_related("created_by").filter(visible=True)
        announcements = Announcement.objects.filter(visible=True)
        return self.success(
            self.paginate_data(request, announcements, AnnouncementListSerializer)
        )
@@ -1,2 +0,0 @@

# Register your models here.
@@ -1,7 +0,0 @@
from django.apps import AppConfig


class ClassPkConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'class_pk'
    verbose_name = '班级PK'
@@ -1,2 +0,0 @@
# Empty file

@@ -1,3 +0,0 @@

# If class-PK history ever needs to be stored, define the models here
# Not needed for now, since everything is computed in real time
@@ -1,3 +0,0 @@
# If serializers are needed, define them here
# Currently using APIView's paginate_data method, so not needed yet

@@ -1,2 +0,0 @@
# Empty file

@@ -1,10 +0,0 @@
from django.urls import path

from ..views.oj import ClassPKAPI, ClassRankAPI, UserClassRankAPI

urlpatterns = [
    path("class_rank", ClassRankAPI.as_view()),
    path("user_class_rank", UserClassRankAPI.as_view()),
    path("class_pk", ClassPKAPI.as_view()),
]
@@ -1,345 +0,0 @@
import statistics
from datetime import datetime

from django.db.models import Avg, Sum
from django.utils import timezone

from account.decorators import login_required
from account.models import AdminType, User, UserProfile
from submission.models import JudgeStatus, Submission
from utils.api import APIView


class ClassRankAPI(APIView):
    """Return the class ranking list."""

    def get(self, request):
        # Grade-year parameter
        grade = int(request.GET.get("grade"))
        # All classes that have users
        classes = (
            User.objects.filter(
                class_name__isnull=False,
                is_disabled=False,
                admin_type__in=[AdminType.REGULAR_USER, AdminType.ADMIN],
                class_name__startswith=str(grade),
            )
            .values("class_name")
            .distinct()
        )

        class_stats = []
        for class_info in classes:
            class_name = class_info["class_name"]
            users = User.objects.filter(
                class_name=class_name,
                is_disabled=False,
                admin_type__in=[AdminType.REGULAR_USER, AdminType.ADMIN],
            )
            user_ids = list(users.values_list("id", flat=True))

            profiles = UserProfile.objects.filter(user_id__in=user_ids)

            total_ac = profiles.aggregate(total=Sum("accepted_number"))["total"] or 0
            total_submission = (
                profiles.aggregate(total=Sum("submission_number"))["total"] or 0
            )
            avg_ac = profiles.aggregate(avg=Avg("accepted_number"))["avg"] or 0

            user_count = users.count()

            class_stats.append(
                {
                    "class_name": class_name,
                    "user_count": user_count,
                    "total_ac": int(total_ac),
                    "total_submission": int(total_submission),
                    "avg_ac": round(avg_ac, 2),
                    "ac_rate": round(total_ac / total_submission * 100, 2)
                    if total_submission > 0
                    else 0,
                }
            )

        # Sort by total AC count
        class_stats.sort(key=lambda x: (-x["total_ac"], x["total_submission"]))

        # Attach ranks
        for i, stat in enumerate(class_stats):
            stat["rank"] = i + 1

        return self.success(class_stats)


class UserClassRankAPI(APIView):
    """Return the user's rank within their class."""

    @login_required
    def get(self, request):
        user = request.user
        if not user.class_name:
            return self.error("用户没有班级信息")
        scope = request.GET.get("scope", "").lower()
        show_all = scope == "all"
        try:
            limit = int(request.GET.get("limit", "10"))
        except ValueError:
            limit = 10
        if limit <= 0 or limit > 250:
            limit = 10
        try:
            offset = int(request.GET.get("offset", "0"))
        except ValueError:
            offset = 0
        if offset < 0:
            offset = 0

        # All users in the same class
        class_users = User.objects.filter(
            class_name=user.class_name,
            is_disabled=False,
            admin_type__in=[AdminType.REGULAR_USER, AdminType.ADMIN],
        ).select_related("userprofile")

        user_ranks = []
        for class_user in class_users:
            profile = class_user.userprofile
            user_ranks.append(
                {
                    "user_id": class_user.id,
                    "username": class_user.username,
                    "accepted_number": profile.accepted_number,
                    "submission_number": profile.submission_number,
                }
            )

        # Sort by AC count
        user_ranks.sort(key=lambda x: (-x["accepted_number"], x["submission_number"]))

        # Attach ranks
        my_rank = -1
        for i, rank_info in enumerate(user_ranks):
            rank_info["rank"] = i + 1
            if rank_info["user_id"] == user.id:
                my_rank = i + 1

        trimmed_ranks = user_ranks
        if not show_all and my_rank > 0 and len(user_ranks) > 10:
            center_index = my_rank - 1
            start = max(0, center_index - 5)
            end = start + 10
            if end > len(user_ranks):
                end = len(user_ranks)
                start = max(0, end - 10)
            trimmed_ranks = user_ranks[start:end]
        elif show_all:
            trimmed_ranks = user_ranks[offset : offset + limit]

        return self.success(
            {
                "class_name": user.class_name,
                "my_rank": my_rank,
                "total": len(user_ranks),
                "ranks": trimmed_ranks,
            }
        )


class ClassPKAPI(APIView):
    """Class PK comparison - multi-dimensional educational evaluation."""

    def post(self, request):
        class_names = request.data.get("class_name", [])
        if not class_names or len(class_names) < 2:
            return self.error("至少需要选择2个班级进行比较")

        # Time-range parameters
        start_time = request.data.get("start_time")
        end_time = request.data.get("end_time")

        # Convert time strings to datetime objects;
        # handle empty strings, None, or undefined
        if start_time and isinstance(start_time, str) and start_time.strip():
            try:
                start_time = datetime.fromisoformat(start_time.replace("Z", "+00:00"))
                if timezone.is_naive(start_time):
                    start_time = timezone.make_aware(start_time)
            except (ValueError, AttributeError):
                start_time = None
        else:
            start_time = None

        if end_time and isinstance(end_time, str) and end_time.strip():
            try:
                end_time = datetime.fromisoformat(end_time.replace("Z", "+00:00"))
                if timezone.is_naive(end_time):
                    end_time = timezone.make_aware(end_time)
            except (ValueError, AttributeError):
                end_time = None
        else:
            end_time = None

        class_comparisons = []

        for class_name in class_names:
            users = User.objects.filter(
                class_name=class_name,
                is_disabled=False,
                admin_type__in=[AdminType.REGULAR_USER, AdminType.ADMIN],
            )
            user_ids = list(users.values_list("id", flat=True))

            # AC counts for every student (used for the statistics below)
            profiles = UserProfile.objects.filter(user_id__in=user_ids)
            ac_list = sorted([p.accepted_number for p in profiles], reverse=True)
            submission_list = sorted(
                [p.submission_number for p in profiles], reverse=True
            )

            user_count = len(ac_list)
            if user_count == 0:
                continue

            # Basic statistics
            total_ac = sum(ac_list)
            total_submission = sum(submission_list)
            avg_ac = statistics.mean(ac_list) if ac_list else 0

            # Median and quartiles
            median_ac = statistics.median(ac_list) if ac_list else 0
            q1_ac = statistics.quantiles(ac_list, n=4)[0] if len(ac_list) > 1 else 0
            q3_ac = statistics.quantiles(ac_list, n=4)[2] if len(ac_list) > 1 else 0
            iqr = q3_ac - q1_ac

            # Standard deviation
            std_dev = statistics.stdev(ac_list) if len(ac_list) > 1 else 0

            # Top-10 and bottom-10 statistics
            top_10_count = min(10, user_count)
            bottom_10_count = min(10, user_count)
            top_10_avg = (
                statistics.mean(ac_list[:top_10_count]) if top_10_count > 0 else 0
            )
            bottom_10_avg = (
                statistics.mean(ac_list[-bottom_10_count:])
                if bottom_10_count > 0
                else 0
            )

            # Top-25% and bottom-25% statistics
            top_25_count = max(1, user_count // 4)
            bottom_25_count = max(1, user_count // 4)
            top_25_avg = (
                statistics.mean(ac_list[:top_25_count]) if top_25_count > 0 else 0
            )
            bottom_25_avg = (
                statistics.mean(ac_list[-bottom_25_count:])
                if bottom_25_count > 0
                else 0
            )

            # Excellence rate (AC count >= median + standard deviation)
            # Median + std dev resists extreme values while still
            # reflecting differences within the class
            excellent_threshold = (
                median_ac + std_dev if std_dev > 0 else median_ac * 1.5
            )
            excellent_count = sum(1 for ac in ac_list if ac >= excellent_threshold)
            excellent_rate = (
                (excellent_count / user_count * 100) if user_count > 0 else 0
            )

            # Pass rate (AC count >= 0.5 x class average)
            pass_threshold = avg_ac * 0.5
            pass_count = sum(1 for ac in ac_list if ac >= pass_threshold)
            pass_rate = (pass_count / user_count * 100) if user_count > 0 else 0

            # Participation rate (share of students with at least one submission)
            active_count = sum(1 for sub in submission_list if sub > 0)
            active_rate = (active_count / user_count * 100) if user_count > 0 else 0

            # Statistics within the time range (if one was provided)
            recent_stats = {}
            if start_time and end_time:
                submissions = Submission.objects.filter(
                    user_id__in=user_ids,
                    create_time__gte=start_time,
                    create_time__lte=end_time,
                )
                recent_ac = (
                    submissions.filter(result=JudgeStatus.ACCEPTED)
                    .values("user_id", "problem_id")
                    .distinct()
                    .count()
                )
                recent_submission = submissions.count()

                # Per-user AC counts within the time range
                recent_user_ac = {}
                for user_id in user_ids:
                    user_recent_ac = (
                        submissions.filter(user_id=user_id, result=JudgeStatus.ACCEPTED)
                        .values("problem_id")
                        .distinct()
                        .count()
                    )
                    recent_user_ac[user_id] = user_recent_ac

                recent_ac_list = sorted(recent_user_ac.values(), reverse=True)
                if recent_ac_list:
                    recent_stats = {
                        "recent_total_ac": recent_ac,
                        "recent_total_submission": recent_submission,
                        "recent_avg_ac": statistics.mean(recent_ac_list),
                        "recent_median_ac": statistics.median(recent_ac_list),
                        "recent_top_10_avg": statistics.mean(
                            recent_ac_list[: min(10, len(recent_ac_list))]
                        )
                        if recent_ac_list
                        else 0,
                        "recent_active_count": sum(
                            1 for ac in recent_ac_list if ac > 0
                        ),
                    }

            class_comparisons.append(
                {
                    "class_name": class_name,
                    "user_count": user_count,
                    # Basic statistics
                    "total_ac": int(total_ac),
                    "total_submission": int(total_submission),
                    "avg_ac": round(avg_ac, 2),
                    # Median and quartiles
                    "median_ac": round(median_ac, 2),
                    "q1_ac": round(q1_ac, 2),
                    "q3_ac": round(q3_ac, 2),
                    "iqr": round(iqr, 2),
                    # Standard deviation
                    "std_dev": round(std_dev, 2),
                    # Stratified statistics
                    "top_10_avg": round(top_10_avg, 2),
                    "bottom_10_avg": round(bottom_10_avg, 2),
                    "top_25_avg": round(top_25_avg, 2),
                    "bottom_25_avg": round(bottom_25_avg, 2),
                    # Rate statistics
                    "excellent_rate": round(excellent_rate, 2),
                    "pass_rate": round(pass_rate, 2),
                    "active_rate": round(active_rate, 2),
                    # Accuracy
                    "ac_rate": round(total_ac / total_submission * 100, 2)
                    if total_submission > 0
|
||||
else 0,
|
||||
# 时间段统计(如果有)
|
||||
**recent_stats,
|
||||
}
|
||||
)
|
||||
|
||||
# 按总AC数排序
|
||||
class_comparisons.sort(key=lambda x: (-x["total_ac"], x["total_submission"]))
|
||||
|
||||
return self.success(
|
||||
{
|
||||
"comparisons": class_comparisons,
|
||||
"has_time_range": bool(start_time and end_time),
|
||||
}
|
||||
)
|
||||
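The quartile, IQR, and excellent-rate arithmetic in the view can be sketched in isolation. This is a minimal sketch with a hypothetical `ac_list`; note that `statistics.quantiles(..., n=4)` uses the exclusive method by default, which interpolates between sorted values:

```python
import statistics

# Hypothetical per-student AC counts, sorted descending as in the view
ac_list = sorted([12, 9, 7, 7, 5, 4, 3, 2, 1, 0], reverse=True)

median_ac = statistics.median(ac_list)
q1_ac = statistics.quantiles(ac_list, n=4)[0]   # first quartile
q3_ac = statistics.quantiles(ac_list, n=4)[2]   # third quartile
iqr = q3_ac - q1_ac
std_dev = statistics.stdev(ac_list)             # sample standard deviation

# "Excellent" threshold: median plus one standard deviation
excellent_threshold = median_ac + std_dev
excellent_count = sum(1 for ac in ac_list if ac >= excellent_threshold)
```

For these ten values the median is 4.5 and only the two strongest students clear the median-plus-stdev bar, which illustrates why the threshold resists being dragged up by a single outlier.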
@@ -1,21 +0,0 @@
# Generated by Django 6.0 on 2026-04-23 20:07

from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('comment', '0001_initial'),
        ('problem', '0007_problem_problem_visible_idx'),
        ('submission', '0004_submission_problem_user_idx'),
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.AddIndex(
            model_name='comment',
            index=models.Index(fields=['problem', 'create_time'], name='comment_problem_time_idx'),
        ),
    ]
@@ -40,8 +40,5 @@ class Comment(models.Model):
    class Meta:
        db_table = "comment"
        ordering = ("-create_time",)
        indexes = [
            models.Index(fields=["problem", "create_time"], name="comment_problem_time_idx"),
        ]
@@ -2,6 +2,7 @@ from django.urls import path

from ..views.admin import CommentAPI


urlpatterns = [
    path("comment", CommentAPI.as_view()),
]
@@ -2,6 +2,7 @@ from django.urls import path

from ..views.oj import CommentAPI, CommentStatisticsAPI


urlpatterns = [
    path("comment", CommentAPI.as_view()),
    path("comment/statistics", CommentStatisticsAPI.as_view()),
]
@@ -1,8 +1,8 @@
from account.decorators import super_admin_required
from comment.models import Comment
from comment.serializers import CommentListSerializer
from problem.models import Problem
from utils.api import APIView
from comment.models import Comment


class CommentAPI(APIView):
@@ -1,15 +1,12 @@
from django.core.cache import cache
from django.db.models import Avg, Count
from django.db.models import Avg
from django.db.models.functions import Round

from account.decorators import login_required
from comment.models import Comment
from comment.serializers import CommentSerializer, CreateCommentSerializer
from problem.models import Problem
from submission.models import JudgeStatus, Submission
from utils.api import APIView
from account.decorators import login_required
from utils.api.api import validate_serializer
from utils.constants import CacheKey
from comment.serializers import CreateCommentSerializer, CommentSerializer
from submission.models import Submission, JudgeStatus


class CommentAPI(APIView):
@@ -49,7 +46,6 @@ class CommentAPI(APIView):
            comprehensive_rating=data["comprehensive_rating"],
            content=data["content"],
        )
        cache.delete(f"{CacheKey.comment_stats}:{problem.id}")
        return self.success()

    @login_required
@@ -69,24 +65,16 @@ class CommentAPI(APIView):
class CommentStatisticsAPI(APIView):
    def get(self, request):
        problem_id = request.GET.get("problem_id")
        cache_key = f"{CacheKey.comment_stats}:{problem_id}"
        cached = cache.get(cache_key)
        if cached is not None:
            return self.success(cached)
        comments = Comment.objects.select_related("problem").filter(
            problem_id=problem_id
        )
        if comments.count() == 0:
            return self.success()

        agg = Comment.objects.filter(problem_id=problem_id).aggregate(
            count=Count("id"),
        count = comments.count()
        rating = comments.aggregate(
            description=Round(Avg("description_rating"), 2),
            difficulty=Round(Avg("difficulty_rating"), 2),
            comprehensive=Round(Avg("comprehensive_rating"), 2),
        )
        if not agg["count"]:
            return self.success()

        data = {"count": agg["count"], "rating": {
            "description": agg["description"],
            "difficulty": agg["difficulty"],
            "comprehensive": agg["comprehensive"],
        }}
        cache.set(cache_key, data, 3600)
        return self.success(data)
        return self.success({"count": count, "rating": rating})
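The read path above is the cache-aside pattern: check the cache, compute on miss, store with a TTL, and delete the key when a new comment is written (the `cache.delete` call in the POST handler). A minimal dict-backed sketch of the same flow; the names and the dict cache are illustrative stand-ins, not `django.core.cache` itself:

```python
# Dict-backed stand-in for django.core.cache; `compute` plays the role of
# the rating aggregation query.
cache = {}

def get_stats(problem_id, compute):
    key = f"comment_stats:{problem_id}"
    if key in cache:                 # cache hit: skip the database entirely
        return cache[key]
    data = compute(problem_id)       # cache miss: run the aggregation
    cache[key] = data                # in Django: cache.set(key, data, 3600)
    return data

def invalidate(problem_id):
    # Called after a new comment is posted, mirroring cache.delete(...)
    cache.pop(f"comment_stats:{problem_id}", None)

calls = []
def fake_compute(pid):
    calls.append(pid)
    return {"count": 1}

first = get_stats(7, fake_compute)
second = get_stats(7, fake_compute)   # served from cache, no recompute
invalidate(7)
third = get_stats(7, fake_compute)    # recomputed after invalidation
```

Deleting on write rather than updating in place keeps the invariant simple: the next reader repopulates the cache from the authoritative query.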
@@ -1,98 +0,0 @@
"""
WebSocket consumers for configuration updates
"""
import json
import logging

from channels.generic.websocket import AsyncWebsocketConsumer

logger = logging.getLogger(__name__)


class ConfigConsumer(AsyncWebsocketConsumer):
    """
    WebSocket consumer for real-time configuration updates.
    When an admin changes a setting, the change is pushed to clients over WebSocket in real time.
    """

    async def connect(self):
        """Handle a WebSocket connection"""
        self.user = self.scope["user"]

        # Only allow authenticated users to connect
        if not self.user.is_authenticated:
            await self.close()
            return

        # Use a global group name so every user receives config updates
        self.group_name = "config_updates"

        # Join the config-update group
        await self.channel_layer.group_add(
            self.group_name,
            self.channel_name
        )

        await self.accept()
        logger.info(f"Config WebSocket connected: user_id={self.user.id}, channel={self.channel_name}")

    async def disconnect(self, close_code):
        """Handle a WebSocket disconnect"""
        if hasattr(self, 'group_name'):
            await self.channel_layer.group_discard(
                self.group_name,
                self.channel_name
            )
        logger.info(f"Config WebSocket disconnected: user_id={self.user.id}, close_code={close_code}")

    async def receive(self, text_data):
        """
        Receive a message from the client.
        Clients may send heartbeats or config-update requests.
        """
        try:
            data = json.loads(text_data)
            message_type = data.get("type")

            if message_type == "ping":
                # Answer the heartbeat
                await self.send(text_data=json.dumps({
                    "type": "pong",
                    "timestamp": data.get("timestamp")
                }))
            elif message_type == "config_update":
                # Handle a config-update request
                key = data.get("key")
                value = data.get("value")
                if key and value is not None:
                    logger.info(f"User {self.user.id} requested config update: {key}={value}")
                    # Permission check: only admins may broadcast config updates
                    if self.user.is_superuser:
                        # Broadcast the config update to every connected client
                        await self.channel_layer.group_send(
                            self.group_name,
                            {
                                "type": "config_update",
                                "data": {
                                    "type": "config_update",
                                    "key": key,
                                    "value": value
                                }
                            }
                        )
        except json.JSONDecodeError:
            logger.error(f"Invalid JSON received from user {self.user.id}")
        except Exception as e:
            logger.error(f"Error handling message from user {self.user.id}: {str(e)}")

    async def config_update(self, event):
        """
        Receive a config-update message from the channel layer and forward it to the client.
        The method name matches the "type" field used in group_send.
        """
        try:
            # Extract the payload from the event and send it to the client
            await self.send(text_data=json.dumps(event["data"]))
            logger.debug(f"Sent config update to user {self.user.id}: {event['data']}")
        except Exception as e:
            logger.error(f"Error sending config update to user {self.user.id}: {str(e)}")
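The consumer's fan-out relies on channel-layer group semantics: `group_add` registers a channel in a group, and `group_send` delivers one message to every member. A rough in-memory sketch of that contract, with asyncio queues standing in for the real Redis channel layer (this is not the channels API, just its group semantics in miniature):

```python
import asyncio

class InMemoryGroupLayer:
    """Toy stand-in for a channels channel layer: one queue per channel."""

    def __init__(self):
        self.groups = {}

    async def group_add(self, group, channel):
        self.groups.setdefault(group, {})[channel] = asyncio.Queue()

    async def group_discard(self, group, channel):
        self.groups.get(group, {}).pop(channel, None)

    async def group_send(self, group, message):
        # Fan the message out to every channel currently in the group
        for queue in self.groups.get(group, {}).values():
            await queue.put(message)

async def demo():
    layer = InMemoryGroupLayer()
    await layer.group_add("config_updates", "chan-1")
    await layer.group_add("config_updates", "chan-2")
    await layer.group_send("config_updates",
                           {"type": "config_update",
                            "data": {"key": "website_name", "value": "OJ"}})
    return [await layer.groups["config_updates"][c].get()
            for c in ("chan-1", "chan-2")]

received = asyncio.run(demo())
```

Every connected channel receives its own copy of the message, which is why a single admin edit reaches all open browser tabs.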
@@ -27,7 +27,6 @@ class CreateEditWebsiteConfigSerializer(serializers.Serializer):
    allow_register = serializers.BooleanField()
    submission_list_show_all = serializers.BooleanField()
    class_list = serializers.ListField(child=serializers.CharField(max_length=64))
    enable_maxkb = serializers.BooleanField()


class JudgeServerSerializer(serializers.ModelSerializer):
185 conf/tests.py Normal file
@@ -0,0 +1,185 @@
import hashlib
from unittest import mock

from django.conf import settings
from django.utils import timezone

from options.options import SysOptions
from utils.api.tests import APITestCase
from .models import JudgeServer


class SMTPConfigTest(APITestCase):
    def setUp(self):
        self.user = self.create_super_admin()
        self.url = self.reverse("smtp_admin_api")
        self.password = "testtest"

    def test_create_smtp_config(self):
        data = {"server": "smtp.test.com", "email": "test@test.com", "port": 465,
                "tls": True, "password": self.password}
        resp = self.client.post(self.url, data=data)
        self.assertSuccess(resp)
        self.assertTrue("password" not in resp.data)
        return resp

    def test_edit_without_password(self):
        self.test_create_smtp_config()
        data = {"server": "smtp1.test.com", "email": "test2@test.com", "port": 465,
                "tls": True}
        resp = self.client.put(self.url, data=data)
        self.assertSuccess(resp)

    def test_edit_without_password1(self):
        self.test_create_smtp_config()
        data = {"server": "smtp.test.com", "email": "test@test.com", "port": 465,
                "tls": True, "password": ""}
        resp = self.client.put(self.url, data=data)
        self.assertSuccess(resp)

    def test_edit_with_password(self):
        self.test_create_smtp_config()
        data = {"server": "smtp1.test.com", "email": "test2@test.com", "port": 465,
                "tls": True, "password": "newpassword"}
        resp = self.client.put(self.url, data=data)
        self.assertSuccess(resp)

    @mock.patch("conf.views.send_email")
    def test_test_smtp(self, mocked_send_email):
        url = self.reverse("smtp_test_api")
        self.test_create_smtp_config()
        resp = self.client.post(url, data={"email": "test@test.com"})
        self.assertSuccess(resp)
        mocked_send_email.assert_called_once()


class WebsiteConfigAPITest(APITestCase):
    def test_create_website_config(self):
        self.create_super_admin()
        url = self.reverse("website_config_api")
        data = {"website_base_url": "http://test.com", "website_name": "test name",
                "website_name_shortcut": "test oj", "website_footer": "<a>test</a>",
                "allow_register": True, "submission_list_show_all": False}
        resp = self.client.post(url, data=data)
        self.assertSuccess(resp)

    def test_edit_website_config(self):
        self.create_super_admin()
        url = self.reverse("website_config_api")
        data = {"website_base_url": "http://test.com", "website_name": "test name",
                "website_name_shortcut": "test oj", "website_footer": "<img onerror=alert(1) src=#>",
                "allow_register": True, "submission_list_show_all": False}
        resp = self.client.post(url, data=data)
        self.assertSuccess(resp)
        self.assertEqual(SysOptions.website_footer, '<img src="#" />')

    def test_get_website_config(self):
        # do not need to login
        url = self.reverse("website_info_api")
        resp = self.client.get(url)
        self.assertSuccess(resp)


class JudgeServerHeartbeatTest(APITestCase):
    def setUp(self):
        self.url = self.reverse("judge_server_heartbeat_api")
        self.data = {"hostname": "testhostname", "judger_version": "1.0.4", "cpu_core": 4,
                     "cpu": 90.5, "memory": 80.3, "action": "heartbeat", "service_url": "http://127.0.0.1"}
        self.token = "test"
        self.hashed_token = hashlib.sha256(self.token.encode("utf-8")).hexdigest()
        SysOptions.judge_server_token = self.token
        self.headers = {"HTTP_X_JUDGE_SERVER_TOKEN": self.hashed_token, settings.IP_HEADER: "1.2.3.4"}

    def test_new_heartbeat(self):
        resp = self.client.post(self.url, data=self.data, **self.headers)
        self.assertSuccess(resp)
        server = JudgeServer.objects.first()
        self.assertEqual(server.ip, "127.0.0.1")

    def test_update_heartbeat(self):
        self.test_new_heartbeat()
        data = self.data
        data["judger_version"] = "2.0.0"
        resp = self.client.post(self.url, data=data, **self.headers)
        self.assertSuccess(resp)
        self.assertEqual(JudgeServer.objects.get(hostname=self.data["hostname"]).judger_version, data["judger_version"])


class JudgeServerAPITest(APITestCase):
    def setUp(self):
        self.server = JudgeServer.objects.create(**{"hostname": "testhostname", "judger_version": "1.0.4",
                                                    "cpu_core": 4, "cpu_usage": 90.5, "memory_usage": 80.3,
                                                    "last_heartbeat": timezone.now()})
        self.url = self.reverse("judge_server_api")
        self.create_super_admin()

    def test_get_judge_server(self):
        resp = self.client.get(self.url)
        self.assertSuccess(resp)
        self.assertEqual(len(resp.data["data"]["servers"]), 1)

    def test_delete_judge_server(self):
        resp = self.client.delete(self.url + "?hostname=testhostname")
        self.assertSuccess(resp)
        self.assertFalse(JudgeServer.objects.filter(hostname="testhostname").exists())

    def test_disabled_judge_server(self):
        resp = self.client.put(self.url, data={"is_disabled": True, "id": self.server.id})
        self.assertSuccess(resp)
        self.assertTrue(JudgeServer.objects.get(id=self.server.id).is_disabled)


class LanguageListAPITest(APITestCase):
    def test_get_languages(self):
        resp = self.client.get(self.reverse("language_list_api"))
        self.assertSuccess(resp)


class TestCasePruneAPITest(APITestCase):
    def setUp(self):
        self.url = self.reverse("prune_test_case_api")
        self.create_super_admin()

    def test_get_isolated_test_case(self):
        resp = self.client.get(self.url)
        self.assertSuccess(resp)

    @mock.patch("conf.views.TestCasePruneAPI.delete_one")
    @mock.patch("conf.views.os.listdir")
    @mock.patch("conf.views.Problem")
    def test_delete_test_case(self, mocked_problem, mocked_listdir, mocked_delete_one):
        valid_id = "1172980672983b2b49820be3a741b109"
        mocked_problem.return_value = [valid_id, ]
        mocked_listdir.return_value = [valid_id, ".test", "aaa"]
        resp = self.client.delete(self.url)
        self.assertSuccess(resp)
        mocked_delete_one.assert_called_once_with(valid_id)


class ReleaseNoteAPITest(APITestCase):
    def setUp(self):
        self.url = self.reverse("get_release_notes_api")
        self.create_super_admin()
        self.latest_data = {"update": [
            {
                "version": "2099-12-25",
                "level": 1,
                "title": "Update at 2099-12-25",
                "details": ["test get", ]
            }
        ]}

    def test_get_versions(self):
        resp = self.client.get(self.url)
        self.assertSuccess(resp)


class DashboardInfoAPITest(APITestCase):
    def setUp(self):
        self.url = self.reverse("dashboard_info_api")
        self.create_admin()

    def test_get_info(self):
        resp = self.client.get(self.url)
        self.assertSuccess(resp)
        self.assertEqual(resp.data["data"]["user_count"], 1)
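The heartbeat tests authenticate with a SHA-256 digest of the shared token in the `X-Judge-Server-Token` header, so the raw token never travels over the wire; the server hashes its stored token the same way and compares digests. The hashing step, plus a constant-time comparison on the verifying side (the `hmac.compare_digest` call is this sketch's suggestion, not necessarily what the project uses):

```python
import hashlib
import hmac

# Client side: send only the digest of the shared token
token = "test"
hashed_token = hashlib.sha256(token.encode("utf-8")).hexdigest()

# Server side: hash the stored token identically and compare in constant time
expected = hashlib.sha256("test".encode("utf-8")).hexdigest()
ok = hmac.compare_digest(hashed_token, expected)
```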
@@ -2,13 +2,13 @@ from django.urls import path

from ..views import (
    SMTPAPI,
    DashboardInfoAPI,
    JudgeServerAPI,
    RandomUsernameAPI,
    ReleaseNotesAPI,
    SMTPTestAPI,
    TestCasePruneAPI,
    WebsiteConfigAPI,
    TestCasePruneAPI,
    SMTPTestAPI,
    ReleaseNotesAPI,
    DashboardInfoAPI,
    RandomUsernameAPI,
)

urlpatterns = [
@@ -1,12 +1,6 @@
from django.urls import path

from ..views import (
    ClassUsernamesAPI,
    HitokotoAPI,
    JudgeServerHeartbeatAPI,
    LanguagesAPI,
    WebsiteConfigAPI,
)
from ..views import HitokotoAPI, JudgeServerHeartbeatAPI, LanguagesAPI, WebsiteConfigAPI

urlpatterns = [
    path("website", WebsiteConfigAPI.as_view()),
@@ -14,5 +8,4 @@ urlpatterns = [
    path("judge_server_heartbeat/", JudgeServerHeartbeatAPI.as_view()),
    path("languages", LanguagesAPI.as_view()),
    path("hitokoto", HitokotoAPI.as_view()),
    path("class_usernames", ClassUsernamesAPI.as_view()),
]
@@ -22,19 +22,17 @@ from problem.models import Problem
from submission.models import Submission
from utils.api import APIView, CSRFExemptAPIView, validate_serializer
from utils.cache import JsonDataLoader
from utils.shortcuts import get_env, send_email
from utils.websocket import push_config_update
from utils.shortcuts import send_email, get_env
from utils.xss_filter import XSSHtml

from .models import JudgeServer
from .serializers import (
    CreateEditWebsiteConfigSerializer,
    CreateSMTPConfigSerializer,
    EditJudgeServerSerializer,
    EditSMTPConfigSerializer,
    JudgeServerHeartbeatSerializer,
    JudgeServerSerializer,
    TestSMTPConfigSerializer,
    EditJudgeServerSerializer,
)
@@ -109,7 +107,6 @@ class WebsiteConfigAPI(APIView):
                "allow_register",
                "submission_list_show_all",
                "class_list",
                "enable_maxkb",
            ]
        }
        return self.success(ret)
@@ -122,10 +119,6 @@ class WebsiteConfigAPI(APIView):
        with XSSHtml() as parser:
            v = parser.clean(v)
        setattr(SysOptions, k, v)

        # Push the config update to all connected clients
        push_config_update(k, v)

        return self.success()
@@ -211,6 +204,7 @@ class LanguagesAPI(APIView):
        return self.success(
            {
                "languages": SysOptions.languages,
                "spj_languages": SysOptions.spj_languages,
            }
        )
@@ -316,32 +310,8 @@ class RandomUsernameAPI(APIView):

class HitokotoAPI(APIView):
    def get(self, request):
        try:
            categories = JsonDataLoader.load_data(
                settings.HITOKOTO_DIR, "categories.json"
            )
            path = random.choice(categories).get("path")
            sentences = JsonDataLoader.load_data(settings.HITOKOTO_DIR, path)
            sentence = random.choice(sentences)
            return self.success(sentence)
        except Exception:
            return self.error("获取一言失败,请稍后再试")


class ClassUsernamesAPI(APIView):
    def get(self, request):
        classroom = request.GET.get("classroom", "")
        if not classroom:
            return self.error("需要班级号")
        users = User.objects.filter(class_name=classroom).order_by("-create_time")
        names = []
        for user in users:
            prefix = f"ks{classroom}"
            result = (
                user.username[len(prefix) :]
                if user.username.startswith(prefix)
                else user.username
            )
            names.append(result)

        return self.success(names)
        categories = JsonDataLoader.load_data(settings.HITOKOTO_DIR, "categories.json")
        path = random.choice(categories).get("path")
        sentences = JsonDataLoader.load_data(settings.HITOKOTO_DIR, path)
        sentence = random.choice(sentences)
        return self.success(sentence)
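The loop in `ClassUsernamesAPI` strips a `ks<classroom>` prefix from each username when present and leaves other usernames untouched. The core transformation, with made-up sample data:

```python
classroom = "2301"                  # hypothetical classroom number
prefix = f"ks{classroom}"
usernames = ["ks2301alice", "ks2301bob", "carol"]

# Keep a username unchanged when it lacks the expected prefix
names = [u[len(prefix):] if u.startswith(prefix) else u for u in usernames]
```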
@@ -1,23 +0,0 @@
# Generated by Django 6.0 on 2026-03-30 15:28

from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('contest', '0001_initial'),
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.AddIndex(
            model_name='acmcontestrank',
            index=models.Index(fields=['contest', 'accepted_number', 'total_time'], name='acm_rank_order_idx'),
        ),
        migrations.AddIndex(
            model_name='oicontestrank',
            index=models.Index(fields=['contest', 'total_score'], name='oi_rank_order_idx'),
        ),
    ]
@@ -1,23 +0,0 @@
# Generated by Django 6.0 on 2026-04-23 20:07

from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('contest', '0002_acmcontestrank_acm_rank_order_idx_and_more'),
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.AddIndex(
            model_name='acmcontestrank',
            index=models.Index(fields=['contest', 'user'], name='acm_rank_contest_user_idx'),
        ),
        migrations.AddIndex(
            model_name='oicontestrank',
            index=models.Index(fields=['contest', 'user'], name='oi_rank_contest_user_idx'),
        ),
    ]
@@ -46,13 +46,10 @@ class Contest(models.Model):

    # Whether the user may view a problem's statistics such as submission_number, accepted_number, etc.
    def problem_details_permission(self, user):
        return (
            self.rule_type == ContestRuleType.ACM
            or self.status == ContestStatus.CONTEST_ENDED
            or user.is_authenticated
            and user.is_contest_admin(self)
            or self.real_time_rank
        )
        return self.rule_type == ContestRuleType.ACM or \
            self.status == ContestStatus.CONTEST_ENDED or \
            user.is_authenticated and user.is_contest_admin(self) or \
            self.real_time_rank

    class Meta:
        db_table = "contest"
@@ -79,11 +76,6 @@ class ACMContestRank(AbstractContestRank):
    class Meta:
        db_table = "acm_contest_rank"
        unique_together = (("user", "contest"),)
        indexes = [
            models.Index(fields=["contest", "accepted_number", "total_time"],
                         name="acm_rank_order_idx"),
            models.Index(fields=["contest", "user"], name="acm_rank_contest_user_idx"),
        ]


class OIContestRank(AbstractContestRank):
@@ -95,10 +87,6 @@ class OIContestRank(AbstractContestRank):
    class Meta:
        db_table = "oi_contest_rank"
        unique_together = (("user", "contest"),)
        indexes = [
            models.Index(fields=["contest", "total_score"], name="oi_rank_order_idx"),
            models.Index(fields=["contest", "user"], name="oi_rank_contest_user_idx"),
        ]


class ContestAnnouncement(models.Model):
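The composite indexes in these hunks match the rank queries: `(contest, accepted_number, total_time)` supports the ACM leaderboard and `(contest, user)` supports single-user lookups, because the leftmost column lets the planner seek directly to one contest's rows. A quick SQLite check of that leftmost-prefix behavior (schema simplified to the indexed columns only; this is an illustration, not the project's actual DDL):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE acm_contest_rank ("
            "contest_id INTEGER, accepted_number INTEGER, total_time INTEGER)")
con.execute("CREATE INDEX acm_rank_order_idx ON acm_contest_rank "
            "(contest_id, accepted_number, total_time)")

# The leftmost index column lets the planner seek to a single contest
plan = con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT accepted_number, total_time FROM acm_contest_rank "
    "WHERE contest_id = ?", (1,)
).fetchall()
uses_index = any("acm_rank_order_idx" in str(row) for row in plan)
```

PostgreSQL's planner applies the same leftmost-prefix rule, so the index serves both the contest-scoped scan and, together with `(contest, user)`, the per-user point lookup.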
@@ -1,6 +1,7 @@
from utils.api import UsernameSerializer, serializers

from .models import ACMContestRank, Contest, ContestAnnouncement, ContestRuleType, OIContestRank
from .models import Contest, ContestAnnouncement, ContestRuleType
from .models import ACMContestRank, OIContestRank


class CreateConetestSeriaizer(serializers.Serializer):
162 contest/tests.py Normal file
@@ -0,0 +1,162 @@
import copy
from datetime import datetime, timedelta

from django.utils import timezone

from utils.api.tests import APITestCase

from .models import ContestAnnouncement, ContestRuleType, Contest

DEFAULT_CONTEST_DATA = {"title": "test title", "description": "test description",
                        "start_time": timezone.localtime(timezone.now()),
                        "end_time": timezone.localtime(timezone.now()) + timedelta(days=1),
                        "rule_type": ContestRuleType.ACM,
                        "password": "123",
                        "allowed_ip_ranges": [],
                        "visible": True, "real_time_rank": True}


class ContestAdminAPITest(APITestCase):
    def setUp(self):
        self.create_super_admin()
        self.url = self.reverse("contest_admin_api")
        self.data = copy.deepcopy(DEFAULT_CONTEST_DATA)

    def test_create_contest(self):
        response = self.client.post(self.url, data=self.data)
        self.assertSuccess(response)
        return response

    def test_create_contest_with_invalid_cidr(self):
        self.data["allowed_ip_ranges"] = ["127.0.0"]
        resp = self.client.post(self.url, data=self.data)
        self.assertTrue(resp.data["data"].endswith("is not a valid cidr network"))

    def test_update_contest(self):
        id = self.test_create_contest().data["data"]["id"]
        update_data = {"id": id, "title": "update title",
                       "description": "update description",
                       "password": "12345",
                       "visible": False, "real_time_rank": False}
        data = copy.deepcopy(self.data)
        data.update(update_data)
        response = self.client.put(self.url, data=data)
        self.assertSuccess(response)
        response_data = response.data["data"]
        for k in data.keys():
            if isinstance(data[k], datetime):
                continue
            self.assertEqual(response_data[k], data[k])

    def test_get_contests(self):
        self.test_create_contest()
        response = self.client.get(self.url)
        self.assertSuccess(response)

    def test_get_one_contest(self):
        id = self.test_create_contest().data["data"]["id"]
        response = self.client.get("{}?id={}".format(self.url, id))
        self.assertSuccess(response)


class ContestAPITest(APITestCase):
    def setUp(self):
        user = self.create_admin()
        self.contest = Contest.objects.create(created_by=user, **DEFAULT_CONTEST_DATA)
        self.url = self.reverse("contest_api") + "?id=" + str(self.contest.id)

    def test_get_contest_list(self):
        url = self.reverse("contest_list_api")
        response = self.client.get(url + "?limit=10")
        self.assertSuccess(response)
        self.assertEqual(len(response.data["data"]["results"]), 1)

    def test_get_one_contest(self):
        resp = self.client.get(self.url)
        self.assertSuccess(resp)

    def test_regular_user_validate_contest_password(self):
        self.create_user("test", "test123")
        url = self.reverse("contest_password_api")
        resp = self.client.post(url, {"contest_id": self.contest.id, "password": "error_password"})
        self.assertDictEqual(resp.data, {"error": "error", "data": "Wrong password or password expired"})

        resp = self.client.post(url, {"contest_id": self.contest.id, "password": DEFAULT_CONTEST_DATA["password"]})
        self.assertSuccess(resp)

    def test_regular_user_access_contest(self):
        self.create_user("test", "test123")
        url = self.reverse("contest_access_api")
        resp = self.client.get(url + "?contest_id=" + str(self.contest.id))
        self.assertFalse(resp.data["data"]["access"])

        password_url = self.reverse("contest_password_api")
        resp = self.client.post(password_url,
                                {"contest_id": self.contest.id, "password": DEFAULT_CONTEST_DATA["password"]})
        self.assertSuccess(resp)
        resp = self.client.get(self.url)
        self.assertSuccess(resp)


class ContestAnnouncementAdminAPITest(APITestCase):
    def setUp(self):
        self.create_super_admin()
        self.url = self.reverse("contest_announcement_admin_api")
        contest_id = self.create_contest().data["data"]["id"]
        self.data = {"title": "test title", "content": "test content", "contest_id": contest_id, "visible": True}

    def create_contest(self):
        url = self.reverse("contest_admin_api")
        data = DEFAULT_CONTEST_DATA
        return self.client.post(url, data=data)

    def test_create_contest_announcement(self):
        response = self.client.post(self.url, data=self.data)
        self.assertSuccess(response)
        return response

    def test_delete_contest_announcement(self):
        id = self.test_create_contest_announcement().data["data"]["id"]
        response = self.client.delete("{}?id={}".format(self.url, id))
        self.assertSuccess(response)
        self.assertFalse(ContestAnnouncement.objects.filter(id=id).exists())

    def test_get_contest_announcements(self):
        self.test_create_contest_announcement()
        response = self.client.get(self.url + "?contest_id=" + str(self.data["contest_id"]))
        self.assertSuccess(response)

    def test_get_one_contest_announcement(self):
        id = self.test_create_contest_announcement().data["data"]["id"]
        response = self.client.get("{}?id={}".format(self.url, id))
        self.assertSuccess(response)


class ContestAnnouncementListAPITest(APITestCase):
    def setUp(self):
        self.create_super_admin()
        self.url = self.reverse("contest_announcement_api")

    def create_contest_announcements(self):
        contest_id = self.client.post(self.reverse("contest_admin_api"), data=DEFAULT_CONTEST_DATA).data["data"]["id"]
        url = self.reverse("contest_announcement_admin_api")
        self.client.post(url, data={"title": "test title1", "content": "test content1", "contest_id": contest_id})
        self.client.post(url, data={"title": "test title2", "content": "test content2", "contest_id": contest_id})
        return contest_id

    def test_get_contest_announcement_list(self):
        contest_id = self.create_contest_announcements()
        response = self.client.get(self.url, data={"contest_id": contest_id})
        self.assertSuccess(response)


class ContestRankAPITest(APITestCase):
    def setUp(self):
        user = self.create_admin()
        self.acm_contest = Contest.objects.create(created_by=user, **DEFAULT_CONTEST_DATA)
        self.create_user("test", "test123")
        self.url = self.reverse("contest_rank_api")

    def get_contest_rank(self):
        resp = self.client.get(self.url + "?contest_id=" + str(self.acm_contest.id))
        self.assertSuccess(resp)
@@ -1,6 +1,6 @@
from django.urls import path

from ..views.admin import ACMContestHelper, ContestAnnouncementAPI, ContestAPI, DownloadContestSubmissions
from ..views.admin import ContestAnnouncementAPI, ContestAPI, ACMContestHelper, DownloadContestSubmissions

urlpatterns = [
    path("contest", ContestAPI.as_view()),
@@ -1,6 +1,9 @@
from django.urls import path

from ..views.oj import ContestAccessAPI, ContestAnnouncementListAPI, ContestAPI, ContestListAPI, ContestPasswordVerifyAPI, ContestRankAPI
from ..views.oj import ContestAnnouncementListAPI
from ..views.oj import ContestPasswordVerifyAPI, ContestAccessAPI
from ..views.oj import ContestListAPI, ContestAPI
from ..views.oj import ContestRankAPI

urlpatterns = [
    path("contests", ContestListAPI.as_view()),
@@ -6,31 +6,23 @@ from ipaddress import ip_network
import dateutil.parser
from django.http import FileResponse

from account.decorators import super_admin_required
from account.decorators import check_contest_permission, ensure_created_by
from account.models import User
from problem.models import Problem
from submission.models import JudgeStatus, Submission
from submission.models import Submission, JudgeStatus
from utils.api import APIView, validate_serializer
from utils.cache import cache
from utils.constants import CacheKey
from utils.shortcuts import rand_str
from utils.tasks import delete_files

from ..models import ACMContestRank, Contest, ContestAnnouncement
from ..serializers import (
    ACMContesHelperSerializer,
    ContestAdminSerializer,
    ContestAnnouncementSerializer,
    CreateConetestSeriaizer,
    CreateContestAnnouncementSerializer,
    EditConetestSeriaizer,
    EditContestAnnouncementSerializer,
)
from ..models import Contest, ContestAnnouncement, ACMContestRank
from ..serializers import (ContestAnnouncementSerializer, ContestAdminSerializer,
                           CreateConetestSeriaizer, CreateContestAnnouncementSerializer,
                           EditConetestSeriaizer, EditContestAnnouncementSerializer,
                           ACMContesHelperSerializer, )


class ContestAPI(APIView):
    @validate_serializer(CreateConetestSeriaizer)
    @super_admin_required
    def post(self, request):
        data = request.data
        data["start_time"] = dateutil.parser.parse(data["start_time"])
@@ -49,11 +41,11 @@ class ContestAPI(APIView):
        return self.success(ContestAdminSerializer(contest).data)

    @validate_serializer(EditConetestSeriaizer)
    @super_admin_required
    def put(self, request):
        data = request.data
        try:
            contest = Contest.objects.get(id=data.pop("id"))
            ensure_created_by(contest, request.user)
        except Contest.DoesNotExist:
            return self.error("Contest does not exist")
        data["start_time"] = dateutil.parser.parse(data["start_time"])
@@ -76,29 +68,28 @@ class ContestAPI(APIView):
        contest.save()
        return self.success(ContestAdminSerializer(contest).data)

    @super_admin_required
    def get(self, request):
        contest_id = request.GET.get("id")
        if contest_id:
            try:
                contest = Contest.objects.get(id=contest_id)
                ensure_created_by(contest, request.user)
                return self.success(ContestAdminSerializer(contest).data)
            except Contest.DoesNotExist:
                return self.error("Contest does not exist")

        contests = Contest.objects.all().order_by("-create_time")
        if request.user.is_admin():
            contests = contests.filter(created_by=request.user)

        keyword = request.GET.get("keyword")
        if keyword:
            contests = contests.filter(title__contains=keyword)
        return self.success(
            self.paginate_data(request, contests, ContestAdminSerializer)
        )
        return self.success(self.paginate_data(request, contests, ContestAdminSerializer))


class ContestAnnouncementAPI(APIView):
    @validate_serializer(CreateContestAnnouncementSerializer)
    @super_admin_required
    def post(self, request):
        """
        Create one contest_announcement.
@@ -106,6 +97,7 @@ class ContestAnnouncementAPI(APIView):
        data = request.data
        try:
            contest = Contest.objects.get(id=data.pop("contest_id"))
            ensure_created_by(contest, request.user)
            data["contest"] = contest
            data["created_by"] = request.user
        except Contest.DoesNotExist:
@@ -114,7 +106,6 @@ class ContestAnnouncementAPI(APIView):
        return self.success(ContestAnnouncementSerializer(announcement).data)

    @validate_serializer(EditContestAnnouncementSerializer)
    @super_admin_required
    def put(self, request):
        """
        update contest_announcement
@@ -122,6 +113,7 @@ class ContestAnnouncementAPI(APIView):
        data = request.data
        try:
            contest_announcement = ContestAnnouncement.objects.get(id=data.pop("id"))
            ensure_created_by(contest_announcement, request.user)
        except ContestAnnouncement.DoesNotExist:
            return self.error("Contest announcement does not exist")
        for k, v in data.items():
@@ -129,17 +121,19 @@ class ContestAnnouncementAPI(APIView):
        contest_announcement.save()
        return self.success()

    @super_admin_required
    def delete(self, request):
        """
        Delete one contest_announcement.
        """
        contest_announcement_id = request.GET.get("id")
        if contest_announcement_id:
            ContestAnnouncement.objects.filter(id=contest_announcement_id).delete()
            if request.user.is_admin():
                ContestAnnouncement.objects.filter(id=contest_announcement_id,
                                                   contest__created_by=request.user).delete()
            else:
                ContestAnnouncement.objects.filter(id=contest_announcement_id).delete()
        return self.success()

    @super_admin_required
    def get(self, request):
        """
        Get one contest_announcement or contest_announcement list.
@@ -147,71 +141,45 @@ class ContestAnnouncementAPI(APIView):
        contest_announcement_id = request.GET.get("id")
        if contest_announcement_id:
            try:
                contest_announcement = ContestAnnouncement.objects.get(
                    id=contest_announcement_id
                )
                return self.success(
                    ContestAnnouncementSerializer(contest_announcement).data
                )
                contest_announcement = ContestAnnouncement.objects.get(id=contest_announcement_id)
                ensure_created_by(contest_announcement, request.user)
                return self.success(ContestAnnouncementSerializer(contest_announcement).data)
            except ContestAnnouncement.DoesNotExist:
                return self.error("Contest announcement does not exist")

        contest_id = request.GET.get("contest_id")
        if not contest_id:
            return self.error("Parameter error")
        contest_announcements = ContestAnnouncement.objects.filter(
            contest_id=contest_id
        )
        contest_announcements = ContestAnnouncement.objects.filter(contest_id=contest_id)
        if request.user.is_admin():
            contest_announcements = contest_announcements.filter(created_by=request.user)
        keyword = request.GET.get("keyword")
        if keyword:
            contest_announcements = contest_announcements.filter(
                title__contains=keyword
            )
        return self.success(
            ContestAnnouncementSerializer(contest_announcements, many=True).data
        )
            contest_announcements = contest_announcements.filter(title__contains=keyword)
        return self.success(ContestAnnouncementSerializer(contest_announcements, many=True).data)


class ACMContestHelper(APIView):
    @super_admin_required
    @check_contest_permission(check_type="ranks")
    def get(self, request):
        contest_id = request.GET.get("contest_id")
        if not contest_id:
            return self.error("Parameter error, contest_id is required")
        try:
            contest = Contest.objects.get(id=contest_id, visible=True)
        except Contest.DoesNotExist:
            return self.error("Contest does not exist")

        problems = Problem.objects.filter(contest=contest).values("id", "_id")
        problem_id_map = {str(p["id"]): p["_id"] for p in problems}

        ranks = ACMContestRank.objects.filter(
            contest=contest, accepted_number__gt=0
        ).values(
            "id", "user__username", "user__userprofile__real_name", "submission_info"
        )
        ranks = ACMContestRank.objects.filter(contest=self.contest, accepted_number__gt=0) \
            .values("id", "user__username", "user__userprofile__real_name", "submission_info")
        results = []
        for rank in ranks:
            for problem_id, info in rank["submission_info"].items():
                if info["is_ac"]:
                    results.append(
                        {
                            "id": rank["id"],
                            "username": rank["user__username"],
                            "real_name": rank["user__userprofile__real_name"],
                            "problem_id": problem_id,
                            "problem_display_id": problem_id_map.get(
                                problem_id, problem_id
                            ),
                            "ac_info": info,
                            "checked": info.get("checked", False),
                        }
                    )
                    results.append({
                        "id": rank["id"],
                        "username": rank["user__username"],
                        "real_name": rank["user__userprofile__real_name"],
                        "problem_id": problem_id,
                        "ac_info": info,
                        "checked": info.get("checked", False)
                    })
        results.sort(key=lambda x: -x["ac_info"]["ac_time"])
        return self.success(results)

    @super_admin_required
    @check_contest_permission(check_type="ranks")
    @validate_serializer(ACMContesHelperSerializer)
    def put(self, request):
        data = request.data
@@ -232,9 +200,7 @@ class DownloadContestSubmissions(APIView):
        problem_ids = contest.problem_set.all().values_list("id", "_id")
        id2display_id = {k[0]: k[1] for k in problem_ids}
        ac_map = {k[0]: False for k in problem_ids}
        submissions = Submission.objects.filter(
            contest=contest, result=JudgeStatus.ACCEPTED
        ).order_by("-create_time")
        submissions = Submission.objects.filter(contest=contest, result=JudgeStatus.ACCEPTED).order_by("-create_time")
        user_ids = submissions.values_list("user_id", flat=True)
        users = User.objects.filter(id__in=user_ids)
        path = f"/tmp/{rand_str()}.zip"
@@ -248,25 +214,21 @@ class DownloadContestSubmissions(APIView):
            problem_id = submission.problem_id
            if user_ac_map[problem_id]:
                continue
            file_name = (
                f"{user.username}_{id2display_id[submission.problem_id]}.txt"
            )
            file_name = f"{user.username}_{id2display_id[submission.problem_id]}.txt"
            compression = zipfile.ZIP_DEFLATED
            zip_file.writestr(
                zinfo_or_arcname=f"{file_name}",
                data=submission.code,
                compress_type=compression,
            )
            zip_file.writestr(zinfo_or_arcname=f"{file_name}",
                              data=submission.code,
                              compress_type=compression)
            user_ac_map[problem_id] = True
        return path

    @super_admin_required
    def get(self, request):
        contest_id = request.GET.get("contest_id")
        if not contest_id:
            return self.error("Parameter error")
        try:
            contest = Contest.objects.get(id=contest_id)
            ensure_created_by(contest, request.user)
        except Contest.DoesNotExist:
            return self.error("Contest does not exist")

@@ -275,7 +237,5 @@ class DownloadContestSubmissions(APIView):
        delete_files.send_with_options(args=(zip_path,), delay=300_000)
        resp = FileResponse(open(zip_path, "rb"))
        resp["Content-Type"] = "application/zip"
        resp["Content-Disposition"] = (
            f"attachment;filename={os.path.basename(zip_path)}"
        )
        resp["Content-Disposition"] = f"attachment;filename={os.path.basename(zip_path)}"
        return resp
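The `DownloadContestSubmissions` hunk above writes one text entry per (user, problem) pair into a zip and uses an AC map to skip duplicates. A self-contained sketch of that pattern with in-memory data (the sample submissions and file-naming scheme are illustrative, not the view's actual inputs):

```python
import io
import zipfile

# Each tuple: (username, problem display id, accepted code).
submissions = [
    ("alice", "A", "print(1)"),
    ("alice", "A", "print(1)  # duplicate AC for the same problem, skipped"),
    ("bob", "B", "print(2)"),
]

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    seen = set()  # plays the role of the per-user ac_map
    for username, display_id, code in submissions:
        key = (username, display_id)
        if key in seen:
            continue
        zf.writestr(f"{username}_{display_id}.txt", code,
                    compress_type=zipfile.ZIP_DEFLATED)
        seen.add(key)

with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
    print(sorted(zf.namelist()))  # → ['alice_A.txt', 'bob_B.txt']
```

The view writes the archive to `/tmp/...` and streams it back with a `FileResponse`; the in-memory buffer here just keeps the sketch self-contained.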
@@ -1,23 +1,26 @@
import io

import xlsxwriter
from django.core.cache import cache
from django.http import HttpResponse
from django.utils.timezone import now
from django.core.cache import cache

from account.decorators import (
    check_contest_password,
    check_contest_permission,
    login_required,
)
from account.models import AdminType
from problem.models import Problem
from utils.api import APIView, validate_serializer
from utils.constants import CONTEST_PASSWORD_SESSION_KEY, CacheKey, ContestRuleType, ContestStatus
from utils.shortcuts import check_is_id, datetime2str
from utils.constants import CacheKey, CONTEST_PASSWORD_SESSION_KEY
from utils.shortcuts import datetime2str, check_is_id
from account.models import AdminType
from account.decorators import (
    login_required,
    check_contest_permission,
    check_contest_password,
)

from ..models import ACMContestRank, Contest, ContestAnnouncement, OIContestRank
from ..serializers import ACMContestRankSerializer, ContestAnnouncementSerializer, ContestPasswordVerifySerializer, ContestSerializer, OIContestRankSerializer
from utils.constants import ContestRuleType, ContestStatus
from ..models import ContestAnnouncement, Contest, OIContestRank, ACMContestRank
from ..serializers import ContestAnnouncementSerializer
from ..serializers import ContestSerializer, ContestPasswordVerifySerializer
from ..serializers import OIContestRankSerializer, ACMContestRankSerializer


class ContestAnnouncementListAPI(APIView):
@@ -166,16 +169,15 @@ class ContestRankAPI(APIView):
        cache_key = f"{CacheKey.contest_rank_cache}:{self.contest.id}"
        qs = cache.get(cache_key)
        if not qs:
            qs = list(self.get_rank())
            qs = self.get_rank()
            cache.set(cache_key, qs)

        if download_csv:
            data = serializer(qs, many=True, is_contest_admin=is_contest_admin).data
            contest_problems = list(Problem.objects.filter(
            contest_problems = Problem.objects.filter(
                contest=self.contest, visible=True
            ).order_by("_id"))
            # Pre-build a problem_id -> column index dict to avoid O(n) list.index() inside the loop
            problem_id_to_col = {p.id: i for i, p in enumerate(contest_problems)}
            ).order_by("_id")
            problem_ids = [item.id for item in contest_problems]

            f = io.BytesIO()
            workbook = xlsxwriter.Workbook(f)
@@ -185,8 +187,11 @@ class ContestRankAPI(APIView):
            worksheet.write("C1", "Real Name")
            if self.contest.rule_type == ContestRuleType.OI:
                worksheet.write("D1", "Total Score")
                for i, p in enumerate(contest_problems):
                    worksheet.write(self.column_string(5 + i) + "1", p.title)
                for item in range(contest_problems.count()):
                    worksheet.write(
                        self.column_string(5 + item) + "1",
                        f"{contest_problems[item].title}",
                    )
                for index, item in enumerate(data):
                    worksheet.write_string(index + 1, 0, str(item["user"]["id"]))
                    worksheet.write_string(index + 1, 1, item["user"]["username"])
@@ -196,14 +201,17 @@ class ContestRankAPI(APIView):
                    worksheet.write_string(index + 1, 3, str(item["total_score"]))
                    for k, v in item["submission_info"].items():
                        worksheet.write_string(
                            index + 1, 4 + problem_id_to_col[int(k)], str(v)
                            index + 1, 4 + problem_ids.index(int(k)), str(v)
                        )
            else:
                worksheet.write("D1", "AC")
                worksheet.write("E1", "Total Submission")
                worksheet.write("F1", "Total Time")
                for i, p in enumerate(contest_problems):
                    worksheet.write(self.column_string(7 + i) + "1", p.title)
                for item in range(contest_problems.count()):
                    worksheet.write(
                        self.column_string(7 + item) + "1",
                        f"{contest_problems[item].title}",
                    )

                for index, item in enumerate(data):
                    worksheet.write_string(index + 1, 0, str(item["user"]["id"]))
@@ -216,7 +224,7 @@ class ContestRankAPI(APIView):
                    worksheet.write_string(index + 1, 5, str(item["total_time"]))
                    for k, v in item["submission_info"].items():
                        worksheet.write_string(
                            index + 1, 6 + problem_id_to_col[int(k)], str(v["is_ac"])
                            index + 1, 6 + problem_ids.index(int(k)), str(v["is_ac"])
                        )

        workbook.close()
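The worksheet code above places per-problem columns via `self.column_string(...)`, whose implementation is outside this hunk. One plausible self-contained version of such a helper (an assumption for illustration, not the repository's actual code) converts a 1-based column number to an Excel-style column label:

```python
def column_string(n: int) -> str:
    """Convert a 1-based column number to an Excel column label.

    1 -> "A", 26 -> "Z", 27 -> "AA", ... (bijective base-26).
    """
    label = ""
    while n > 0:
        n, rem = divmod(n - 1, 26)
        label = chr(ord("A") + rem) + label
    return label

print(column_string(1), column_string(26), column_string(27))  # → A Z AA
```

With a helper like this, `column_string(5 + i) + "1"` addresses the header cell of the i-th problem column, matching the `"E1"`, `"F1"`, ... cells used for the fixed columns.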
@@ -2,23 +2,6 @@ location /public {
    root /data;
}

# WebSocket support
location /ws/ {
    proxy_pass http://websocket;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $http_host;
    proxy_set_header X-Real-IP __IP_HEADER__;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;

    # WebSocket timeouts
    proxy_connect_timeout 7d;
    proxy_send_timeout 7d;
    proxy_read_timeout 7d;
}

location /api {
    include api_proxy.conf;
}

@@ -38,11 +38,6 @@ http {
        keepalive 32;
    }

    upstream websocket {
        server 127.0.0.1:8001;
        keepalive 32;
    }

    add_header X-XSS-Protection "1; mode=block" always;
    add_header X-Frame-Options SAMEORIGIN always;
    add_header X-Content-Type-Options nosniff always;
@@ -51,7 +46,7 @@ http {
        listen 8000 default_server;
        server_name _;

        include locations.conf;
        include http_locations.conf;
    }

    # server {
@@ -1,924 +1,31 @@
|
||||
# This file was autogenerated by uv via the following command:
|
||||
# uv export --format requirements.txt
|
||||
annotated-types==0.7.0 \
|
||||
--hash=sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53 \
|
||||
--hash=sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89
|
||||
# via pydantic
|
||||
anyio==4.13.0 \
|
||||
--hash=sha256:08b310f9e24a9594186fd75b4f73f4a4152069e3853f1ed8bfbf58369f4ad708 \
|
||||
--hash=sha256:334b70e641fd2221c1505b3890c69882fe4a2df910cba14d97019b90b24439dc
|
||||
# via
|
||||
# httpx
|
||||
# openai
|
||||
asgiref==3.11.1 \
|
||||
--hash=sha256:5f184dc43b7e763efe848065441eac62229c9f7b0475f41f80e207a114eda4ce \
|
||||
--hash=sha256:e8667a091e69529631969fd45dc268fa79b99c92c5fcdda727757e52146ec133
|
||||
# via
|
||||
# channels
|
||||
# channels-redis
|
||||
# daphne
|
||||
# django
|
||||
attrs==26.1.0 \
|
||||
--hash=sha256:c647aa4a12dfbad9333ca4e71fe62ddc36f4e63b2d260a37a8b83d2f043ac309 \
|
||||
--hash=sha256:d03ceb89cb322a8fd706d4fb91940737b6642aa36998fe130a9bc96c985eff32
|
||||
# via
|
||||
# service-identity
|
||||
# twisted
|
||||
autobahn==25.12.2 \
|
||||
--hash=sha256:18b12e8af7fc115487715afa10b3f5b5a4b5989bebbe05b71722cf9fce7b1bfb \
|
||||
--hash=sha256:220748f21e91bd4a538d2d3de640cc17ee30b79f1c04a6c3dcdef321d531ee1c \
|
||||
--hash=sha256:754c06a54753aeb7e8d10c5cbf03249ad9e2a1a32bca8be02865c6f00628a98c \
|
||||
--hash=sha256:9abda5cf817c0f8a19a55a67a031adf2fc70ed351719b5bd9e6fa0f5f4bc8f89 \
|
||||
--hash=sha256:b5297a782fc7d0a26842438ef1342549ceee29496cda52672ac44635c79eeb94 \
|
||||
--hash=sha256:ba1867aafdbe585d3d4a5abd35238a78ab54ab3de5bd12a21bca20379c9f512b \
|
||||
--hash=sha256:bc17f6cab9438156d2701c293c76fd02a144f9be0a992c065dfee1935ce4845b \
|
||||
--hash=sha256:c0c3f1d5dafda52f8dc962ab583b6f3473b7b7186cab082d05372ed43a8261a5 \
|
||||
--hash=sha256:c840ee136bfaf6560467160129b0b25a0e33c9a51e2b251e98c5474f27583915 \
|
||||
--hash=sha256:e9e2a962f2de0bc4c53b452916458417a15f5137c956245ac6d0a783a83fa1f7
|
||||
# via daphne
|
||||
automat==25.4.16 \
|
||||
--hash=sha256:0017591a5477066e90d26b0e696ddc143baafd87b588cfac8100bc6be9634de0 \
|
||||
--hash=sha256:04e9bce696a8d5671ee698005af6e5a9fa15354140a87f4870744604dcdd3ba1
|
||||
# via twisted
|
||||
cbor2==6.0.1 \
|
||||
--hash=sha256:067d23ac75bfa35bed0e795169139259dc9d9bae503c8ede29740f99b37415f3 \
|
||||
--hash=sha256:10f0376763ce8913c1a5b9f21c51ca55848ed16795bd2b80860d56ed944374ab \
|
||||
--hash=sha256:1eedb7bda2a528149ff95345e383c2f97104800debc9ef6f0cd693b46b0df4ff \
|
||||
--hash=sha256:3e8eaee64cd09d67a413e1fc758750e9e9c15cdb677a725163da834b981552ec \
|
||||
--hash=sha256:46a745c296ec336fe83fa7905b77b4faa243eb32bb84fab1cfdb0e4636d1985b \
|
||||
--hash=sha256:4d324878156075778da61f9d4a09e6c4306493964f24f8fd92b43d97e99eac10 \
|
||||
--hash=sha256:50ebae27b72061c8baf3cd8458c3eb2de7c112d0be77af24e8c4206a2b0e7b61 \
|
||||
--hash=sha256:5df6d0cd72c62dfb300facd6ccb982214fe3376b69f393d0d271e4436fd7b624 \
|
||||
--hash=sha256:65f0dc88cbd2cc252c31212b0bac3d10ae8e94db5e476a662022593cdd3cc56a \
|
||||
--hash=sha256:67aa9514b08163de9c180d2a2bcf3f3a050d2a2ef9ca9bb8cc8b3a7bd4e6599d \
|
||||
--hash=sha256:6e8fca9f1860e81e7b78af9d5686380143a2474d6bf4dcae348219cd34013436 \
|
||||
--hash=sha256:778746168f80403dcb5e0e85a16076967652aef74bf2d13f53ce3d150e9b8be7 \
|
||||
--hash=sha256:77cf35c614be31c5e8be761328b57ef6aaf43a78301e7df10faa7a8c626d6910 \
|
||||
--hash=sha256:7d936d14307311d0284f7d448fab47a4d1e279305005ffa733411eb81e0b7d81 \
|
||||
--hash=sha256:80765e22c387fb489102ed751f5706fc184c9cdb34257df3dab4d393564b00e6 \
|
||||
--hash=sha256:82802f05ae595cfe451ab6a15948b20445a411fb83ef8568591577f6b91313aa \
|
||||
--hash=sha256:83d2b27908f8697041cee46af54ab684e9dd6e9710d70d31dc50e89cc908433d \
|
||||
--hash=sha256:897f6fe58d1522608b6b71a7aa964f31c40deed5fff2d00511233bacb396dded \
|
||||
--hash=sha256:943e3824c51312f747b0b164fc4ae96c191eae40685e049b28c747158a8613d0 \
|
||||
--hash=sha256:a4413d99d398858603be036016b59d21c1e6c3a4bb9d12fb9ccf4f8509afde05 \
|
||||
--hash=sha256:b8a3cf4a95b219eb10d72e31b6919f47a4928506ae95001e4384531bec5f787d \
|
||||
--hash=sha256:c1cfab10d65989cd79c203a00b5460feb6f34c519714779a77ccfb772704ff4b \
|
||||
--hash=sha256:c6fcf7f406a5e5cda5e993d4dbd064b0cb22e84c9800966e2358a9172b3d4684 \
|
||||
--hash=sha256:d177965364ae29b7d8854a0d38f41e2aa3ef2a440a8fd28550413ea649715eb5 \
|
||||
--hash=sha256:df1e47d7dfb335ee82cd6593db111e6ca12d2c370a08a94d3622b4c08fda3b69 \
|
||||
--hash=sha256:ead214c6d4b4a6b20213c3a4a0e93a565acedbaa367f793cf5bf19936365fa46 \
|
||||
--hash=sha256:f173a5d6a686006c9edaeee5aab1356be2cba86c3af15b592e5cf8749831dcaf \
|
||||
--hash=sha256:f390b24279229499c93f2ba40031fb9dd03cd2fc0d1ae757116013398bb25bc4 \
|
||||
--hash=sha256:fd7f89d53aea0e7d12a08fc8366a5d7d532d7bdf253b042d1e4fd33398ca6f17
|
||||
# via autobahn
|
||||
certifi==2026.4.22 \
|
||||
--hash=sha256:3cb2210c8f88ba2318d29b0388d1023c8492ff72ecdde4ebdaddbb13a31b1c4a \
|
||||
--hash=sha256:8d455352a37b71bf76a79caa83a3d6c25afee4a385d632127b6afb3963f1c580
|
||||
# via
|
||||
# httpcore
|
||||
# httpx
|
||||
# requests
|
||||
# sentry-sdk
|
||||
cffi==2.0.0 \
|
||||
--hash=sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb \
|
||||
--hash=sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b \
|
||||
--hash=sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f \
|
||||
--hash=sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9 \
|
||||
--hash=sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c \
|
||||
--hash=sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75 \
|
||||
--hash=sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e \
|
||||
--hash=sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e \
|
||||
--hash=sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25 \
|
||||
--hash=sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe \
|
||||
--hash=sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b \
|
||||
--hash=sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91 \
|
||||
--hash=sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592 \
|
||||
--hash=sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187 \
|
||||
--hash=sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1 \
|
||||
--hash=sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94 \
|
||||
--hash=sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba \
|
||||
--hash=sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529 \
|
||||
--hash=sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca \
|
||||
--hash=sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6 \
|
||||
--hash=sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4 \
|
||||
--hash=sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d \
|
||||
--hash=sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b \
|
||||
--hash=sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205 \
|
||||
--hash=sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27 \
|
||||
--hash=sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512 \
|
||||
--hash=sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d \
|
||||
--hash=sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c \
|
||||
--hash=sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037 \
|
||||
--hash=sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c \
|
||||
--hash=sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8 \
|
||||
--hash=sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9 \
|
||||
--hash=sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775 \
|
||||
--hash=sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc \
|
||||
--hash=sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062 \
|
||||
--hash=sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13 \
|
||||
--hash=sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26 \
|
||||
--hash=sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b \
|
||||
--hash=sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6 \
|
||||
--hash=sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c \
|
||||
--hash=sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef \
|
||||
--hash=sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5 \
|
||||
--hash=sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18 \
|
||||
--hash=sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad \
|
||||
--hash=sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3 \
|
||||
--hash=sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2 \
|
||||
--hash=sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5
|
||||
# via
|
||||
# autobahn
|
||||
# cryptography
|
||||
channels==4.3.2 \
|
||||
--hash=sha256:f2bb6bfb73ad7fb4705041d07613c7b4e69528f01ef8cb9fb6c21d9295f15667 \
|
||||
--hash=sha256:fef47e9055a603900cf16cef85f050d522d9ac4b3daccf24835bd9580705c176
|
||||
# via
|
||||
# channels-redis
|
||||
# onlinejudge
|
||||
channels-redis==4.3.0 \
|
||||
--hash=sha256:48f3e902ae2d5fef7080215524f3b4a1d3cea4e304150678f867a1a822c0d9f5 \
|
||||
--hash=sha256:740ee7b54f0e28cf2264a940a24453d3f00526a96931f911fcb69228ef245dd2
|
||||
# via onlinejudge
|
||||
charset-normalizer==3.4.7 \
|
||||
--hash=sha256:03853ed82eeebbce3c2abfdbc98c96dc205f32a79627688ac9a27370ea61a49c \
|
||||
--hash=sha256:0c96c3b819b5c3e9e165495db84d41914d6894d55181d2d108cc1a69bfc9cce0 \
|
||||
--hash=sha256:0ea948db76d31190bf08bd371623927ee1339d5f2a0b4b1b4a4439a65298703c \
|
||||
--hash=sha256:0f7eb884681e3938906ed0434f20c63046eacd0111c4ba96f27b76084cd679f5 \
|
||||
--hash=sha256:1c2aed2e5e41f24ea8ef1590b8e848a79b56f3a5564a65ceec43c9d692dc7d8a \
|
||||
--hash=sha256:203104ed3e428044fd943bc4bf45fa73c0730391f9621e37fe39ecf477b128cb \
|
||||
--hash=sha256:2257141f39fe65a3fdf38aeccae4b953e5f3b3324f4ff0daf9f15b8518666a2c \
|
||||
--hash=sha256:298930cec56029e05497a76988377cbd7457ba864beeea92ad7e844fe74cd1f1 \
|
||||
--hash=sha256:2d6eb928e13016cea4f1f21d1e10c1cebd5a421bc57ddf5b1142ae3f86824fab \
|
||||
--hash=sha256:3534e7dcbdcf757da6b85a0bbf5b6868786d5982dd959b065e65481644817a18 \
|
||||
--hash=sha256:3946fa46a0cf3e4c8cb1cc52f56bb536310d34f25f01ca9b6c16afa767dab110 \
|
||||
--hash=sha256:3bec022aec2c514d9cf199522a802bd007cd588ab17ab2525f20f9c34d067c18 \
|
||||
--hash=sha256:3c9a494bc5ec77d43cea229c4f6db1e4d8fe7e1bbffa8b6f0f0032430ff8ab44 \
|
||||
--hash=sha256:3dce51d0f5e7951f8bb4900c257dad282f49190fdbebecd4ba99bcc41fef404d \
|
||||
--hash=sha256:3dedcc22d73ec993f42055eff4fcfed9318d1eeb9a6606c55892a26964964e48 \
|
||||
--hash=sha256:4042d5c8f957e15221d423ba781e85d553722fc4113f523f2feb7b188cc34c5e \
|
||||
--hash=sha256:481551899c856c704d58119b5025793fa6730adda3571971af568f66d2424bb5 \
|
||||
--hash=sha256:4dc1e73c36828f982bfe79fadf5919923f8a6f4df2860804db9a98c48824ce8d \
|
||||
--hash=sha256:54523e136b8948060c0fa0bc7b1b50c32c186f2fceee897a495406bb6e311d2b \
|
||||
--hash=sha256:5649fd1c7bade02f320a462fdefd0b4bd3ce036065836d4f42e0de958038e116 \
|
||||
--hash=sha256:56be790f86bfb2c98fb742ce566dfb4816e5a83384616ab59c49e0604d49c51d \
|
||||
--hash=sha256:5b77459df20e08151cd6f8b9ef8ef1f961ef73d85c21a555c7eed5b79410ec10 \
|
||||
--hash=sha256:5ed6ab538499c8644b8a3e18debabcd7ce684f3fa91cf867521a7a0279cab2d6 \
|
||||
--hash=sha256:6178f72c5508bfc5fd446a5905e698c6212932f25bcdd4b47a757a50605a90e2 \
|
||||
--hash=sha256:64f02c6841d7d83f832cd97ccf8eb8a906d06eb95d5276069175c696b024b60a \
|
||||
--hash=sha256:67f6279d125ca0046a7fd386d01b311c6363844deac3e5b069b514ba3e63c246 \
|
||||
--hash=sha256:6c114670c45346afedc0d947faf3c7f701051d2518b943679c8ff88befe14f8e \
|
||||
--hash=sha256:708838739abf24b2ceb208d0e22403dd018faeef86ddac04319a62ae884c4f15 \
|
||||
--hash=sha256:715479b9a2802ecac752a3b0efa2b0b60285cf962ee38414211abdfccc233b41 \
|
||||
--hash=sha256:733784b6d6def852c814bce5f318d25da2ee65dd4839a0718641c696e09a2960 \
|
||||
--hash=sha256:752a45dc4a6934060b3b0dab47e04edc3326575f82be64bc4fc293914566503e \
|
||||
--hash=sha256:7579e913a5339fb8fa133f6bbcfd8e6749696206cf05acdbdca71a1b436d8e72 \
|
||||
--hash=sha256:7804338df6fcc08105c7745f1502ba68d900f45fd770d5bdd5288ddccb8a42d8 \
|
||||
--hash=sha256:80d04837f55fc81da168b98de4f4b797ef007fc8a79ab71c6ec9bc4dd662b15b \
|
||||
--hash=sha256:8778f0c7a52e56f75d12dae53ae320fae900a8b9b4164b981b9c5ce059cd1fcb \
|
||||
--hash=sha256:8d828b6667a32a728a1ad1d93957cdf37489c57b97ae6c4de2860fa749b8fc1e \
|
||||
--hash=sha256:92a0a01ead5e668468e952e4238cccd7c537364eb7d851ab144ab6627dbbe12f \
|
||||
--hash=sha256:a180c5e59792af262bf263b21a3c49353f25945d8d9f70628e73de370d55e1e1 \
|
||||
--hash=sha256:a277ab8928b9f299723bc1a2dabb1265911b1a76341f90a510368ca44ad9ab66 \
|
||||
--hash=sha256:a5fe03b42827c13cdccd08e6c0247b6a6d4b5e3cdc53fd1749f5896adcdc2356 \
|
||||
--hash=sha256:a89c23ef8d2c6b27fd200a42aa4ac72786e7c60d40efdc76e6011260b6e949c4 \
|
||||
--hash=sha256:ae89db9e5f98a11a4bf50407d4363e7b09b31e55bc117b4f7d80aab97ba009e5 \
|
||||
--hash=sha256:aed52fea0513bac0ccde438c188c8a471c4e0f457c2dd20cdbf6ea7a450046c7 \
|
||||
--hash=sha256:bb6d88045545b26da47aa879dd4a89a71d1dce0f0e549b1abcb31dfe4a8eac49 \
|
||||
--hash=sha256:bd6c2a1c7573c64738d716488d2cdd3c00e340e4835707d8fdb8dc1a66ef164e \
|
||||
--hash=sha256:c03a41a8784091e67a39648f70c5f97b5b6a37f216896d44d2cdcb82615339a0 \
|
||||
--hash=sha256:c35abb8bfff0185efac5878da64c45dafd2b37fb0383add1be155a763c1f083d \
|
||||
--hash=sha256:c36c333c39be2dbca264d7803333c896ab8fa7d4d6f0ab7edb7dfd7aea6e98c0 \
|
||||
--hash=sha256:c45e9440fb78f8ddabcf714b68f936737a121355bf59f3907f4e17721b9d1aae \
|
||||
--hash=sha256:ce3412fbe1e31eb81ea42f4169ed94861c56e643189e1e75f0041f3fe7020abe \
|
||||
--hash=sha256:cf1493cd8607bec4d8a7b9b004e699fcf8f9103a9284cc94962cb73d20f9d4a3 \
--hash=sha256:d6038d37043bced98a66e68d3aa2b6a35505dc01328cd65217cefe82f25def44 \
--hash=sha256:e044c39e41b92c845bc815e5ae4230804e8e7bc29e399b0437d64222d92809dd \
--hash=sha256:e1421b502d83040e6d7fb2fb18dff63957f720da3d77b2fbd3187ceb63755d7b \
--hash=sha256:e712b419df8ba5e42b226c510472b37bd57b38e897d3eca5e8cfd410a29fa859 \
--hash=sha256:e74327fb75de8986940def6e8dee4f127cc9752bee7355bb323cc5b2659b6d46 \
--hash=sha256:e8ac484bf18ce6975760921bb6148041faa8fef0547200386ea0b52b5d27bf7b \
--hash=sha256:eca9705049ad3c7345d574e3510665cb2cf844c2f2dcfe675332677f081cbd46 \
--hash=sha256:edac0f1ab77644605be2cbba52e6b7f630731fc42b34cb0f634be1a6eface56a \
--hash=sha256:effc3f449787117233702311a1b7d8f59cba9ced946ba727bdc329ec69028e24 \
--hash=sha256:f495a1652cf3fbab2eb0639776dad966c2fb874d79d87ca07f9d5f059b8bd215 \
--hash=sha256:f496c9c3cc02230093d8330875c4c3cdfc3b73612a5fd921c65d39cbcef08063 \
--hash=sha256:f59099f9b66f0d7145115e6f80dd8b1d847176df89b234a5a6b3f00437aa0832 \
--hash=sha256:f59ad4c0e8f6bba240a9bb85504faa1ab438237199d4cce5f622761507b8f6a6 \
--hash=sha256:fbccdc05410c9ee21bbf16a35f4c1d16123dcdeb8a1d38f33654fa21d0234f79 \
--hash=sha256:fea24543955a6a729c45a73fe90e08c743f0b3334bbf3201e6c4bc1b0c7fa464
# via requests
colorama==0.4.6 ; sys_platform == 'win32' \
--hash=sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44 \
--hash=sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6
# via
# qrcode
# tqdm
constantly==23.10.4 \
--hash=sha256:3fd9b4d1c3dc1ec9757f3c52aef7e53ad9323dbe39f51dfd4c43853b68dfa3f9 \
--hash=sha256:aa92b70a33e2ac0bb33cd745eb61776594dc48764b06c35e0efd050b7f1c7cbd
# via twisted
cryptography==48.0.0 \
--hash=sha256:0c558d2cdffd8f4bbb30fc7134c74d2ca9a476f830bb053074498fbc86f41ed6 \
--hash=sha256:16cd65b9330583e4619939b3a3843eec1e6e789744bb01e7c7e2e62e33c239c8 \
--hash=sha256:18349bbc56f4743c8b12dc32e2bccb2cf83ee8b69a3bba74ef8ae857e26b3d25 \
--hash=sha256:1e2d54c8be6152856a36f0882ab231e70f8ec7f14e93cf87db8a2ed056bf160c \
--hash=sha256:22a5cb272895dce158b2cacdfdc3debd299019659f42947dbdac6f32d68fe832 \
--hash=sha256:27241b1dc9962e056062a8eef1991d02c3a24569c95975bd2322a8a52c6e5e12 \
--hash=sha256:2b4d59804e8408e2fea7d1fbaf218e5ec984325221db76e6a241a9abd6cdd95c \
--hash=sha256:2eb992bbd4661238c5a397594c83f5b4dc2bc5b848c365c8f991b6780efcc5c7 \
--hash=sha256:369a6348999f94bbd53435c894377b20ab95f25a9065c283570e70150d8abc3c \
--hash=sha256:3cb07a3ed6431663cd321ea8a000a1314c74211f823e4177fefa2255e057d1ec \
--hash=sha256:40ba1f85eaa6959837b1d51c9767e230e14612eea4ef110ee8854ada22da1bf5 \
--hash=sha256:55b7718303bf06a5753dcdccf2f3945cf18ad7bffde41b61226e4db31ab89a9c \
--hash=sha256:561215ea3879cb1cbbf272867e2efda62476f240fb58c64de6b393ae19246741 \
--hash=sha256:58d00498e8933e4a194f3076aee1b4a97dfec1a6da444535755822fe5d8b0b86 \
--hash=sha256:59baa2cb386c4f0b9905bd6eb4c2a79a69a128408fd31d32ca4d7102d4156321 \
--hash=sha256:5a5ed8fde7a1d09376ca0b40e68cd59c69fe23b1f9768bd5824f54681626032a \
--hash=sha256:5b012212e08b8dd5edc78ef54da83dd9892fd9105323b3993eff6bea65dc21d7 \
--hash=sha256:5c3932f4436d1cccb036cb0eaef46e6e2db91035166f1ad6505c3c9d5a635920 \
--hash=sha256:614d0949f4790582d2cc25553abd09dd723025f0c0e7c67376a1d77196743d6e \
--hash=sha256:76341972e1eff8b4bea859f09c0d3e64b96ce931b084f9b9b7db8ef364c30eff \
--hash=sha256:77a2ccbbe917f6710e05ba9adaa25fb5075620bf3ea6fb751997875aff4ae4bd \
--hash=sha256:7995ef305d7165c3f11ae07f2517e5a4f1d5c18da1376a0a9ed496336b69e5f3 \
--hash=sha256:7ce4bfae76319a532a2dc68f82cc32f5676ee792a983187dac07183690e5c66f \
--hash=sha256:7e8eac43dfca5c4cccc6dad9a80504436fca53bb9bc3100a2386d730fbe6b602 \
--hash=sha256:8c7378637d7d88016fa6791c159f698b3d3eed28ebf844ac36b9dc04a14dae18 \
--hash=sha256:8cd666227ef7af430aa5914a9910e0ddd703e75f039cef0825cd0da71b6b711a \
--hash=sha256:906cbf0670286c6e0044156bc7d4af9cbb0ef6db9f73e52c3ec56ba6bdde5336 \
--hash=sha256:9071196d81abc88b3516ac8cdfad32e2b66dd4a5393a8e68a961e9161ddc6239 \
--hash=sha256:9249e3cd978541d665967ac2cb2787fd6a62bddf1e75b3e347a594d7dacf4f74 \
--hash=sha256:984a20b0f62a26f48a3396c72e4bc34c66e356d356bf370053066b3b6d54634a \
--hash=sha256:9be5aafa5736574f8f15f262adc81b2a9869e2cfe9014d52a44633905b40d52c \
--hash=sha256:9c459db21422be75e2809370b829a87eb37f74cd785fc4aa9ea1e5f43b47cda4 \
--hash=sha256:9ccdac7d40688ecb5a3b4a604b8a88c8002e3442d6c60aead1db2a89a041560c \
--hash=sha256:a0e692c683f4df67815a2d258b324e66f4738bd7a96a218c826dce4f4bd05d8f \
--hash=sha256:a5da777e32ffed6f85a7b2b3f7c5cbc88c146bfcd0a1d7baf5fcc6c52ee35dd4 \
--hash=sha256:a64697c641c7b1b2178e573cbc31c7c6684cd56883a478d75143dbb7118036db \
--hash=sha256:ad64688338ed4bc1a6618076ba75fd7194a5f1797ac60b47afe926285adb3166 \
--hash=sha256:bd72e68b06bb1e96913f97dd4901119bc17f39d4586a5adf2d3e47bc2b9d58b5 \
--hash=sha256:c17dfe85494deaeddc5ce251aebd1d60bbe6afc8b62071bb0b469431a000124f \
--hash=sha256:c18684a7f0cc9a3cb60328f496b8e3372def7c5d2df39ac267878b05565aaaae \
--hash=sha256:cc90c0b39b2e3c65ef52c804b72e3c58f8a04ab2a1871272798e5f9572c17d20 \
--hash=sha256:ea8990436d914540a40ab24b6a77c0969695ed52f4a4874c5137ccf7045a7057 \
--hash=sha256:f5333311663ea94f75dd408665686aaf426563556bb5283554a3539177e03b8c
# via
# autobahn
# pyopenssl
# service-identity
daphne==4.2.1 \
--hash=sha256:5f898e700a1fda7addf1541d7c328606415e96a7bd768405f0463c312fcb31b3 \
--hash=sha256:881e96b387b95b35ad85acd855f229d7f5b79073d6649089c8a33f661885e055
# via onlinejudge
distro==1.9.0 \
--hash=sha256:2fa77c6fd8940f116ee1d6b94a2f90b13b5ea8d019b98bc8bafdcabcdd9bdbed \
--hash=sha256:7bffd925d65168f85027d8da9af6bddab658135b840670a223589bc0c8ef02b2
# via openai
django==6.0.4 \
--hash=sha256:14359c809fc16e8f81fd2b59d7d348e4d2d799da6840b10522b6edf7b8afc1da \
--hash=sha256:8cfa2572b3f2768b2e84983cf3c4811877a01edb64e817986ec5d60751c113ac
# via
# channels
# django-cas-ng
# django-dramatiq
# django-redis
# djangorestframework
# onlinejudge
# sentry-sdk
django-cas-ng==5.1.1 \
--hash=sha256:a1839aed955fc756ee35a479cb18eb3dd1912613888bdade069bcc4c405adb79 \
--hash=sha256:c89a4be2d24ab3fbcab3e59c212a3347a42840b0ad2677036b5655003ad4840c
# via onlinejudge
django-dbconn-retry==0.3.1 \
--hash=sha256:d4b64d915440c3e5902ef8edf836366a6f4c4f027d34902135d7335233d6dbba
# via onlinejudge
django-dramatiq==0.15.0 \
--hash=sha256:23f0bc418a860952adbf822c4aa3b9c46c51d3d9f50be0a8ed3d19a53380df1d \
--hash=sha256:e3cf1b2ac288fe4a7aa198c9450fe242ed312df8850f3f9e18ce01b8acc78b96
# via onlinejudge
django-redis==6.0.0 \
--hash=sha256:20bf0063a8abee567eb5f77f375143c32810c8700c0674ced34737f8de4e36c0 \
--hash=sha256:2d9cb12a20424a4c4dde082c6122f486628bae2d9c2bee4c0126a4de7fda00dd
# via onlinejudge
djangorestframework==3.17.1 \
--hash=sha256:a6def5f447fe78ff853bff1d47a3c59bf38f5434b031780b351b0c73a62db1a5 \
--hash=sha256:c3c74dd3e83a5a3efc37b3c18d92bd6f86a6791c7b7d4dff62bb068500e76457
# via onlinejudge
dramatiq==2.1.0 \
--hash=sha256:3ef940c2815722d3679aed79ef96c805f02fd33d4361529b2de30f01511ca44d \
--hash=sha256:cf81550729de6cf64234b05bd63970645654aaf38967faa7a2b6e401384bb090
# via
# django-dramatiq
# onlinejudge
gunicorn==26.0.0 \
--hash=sha256:40233d26a5f0d1872916188c276e21641155111c2853f0c2cd55260aec0d24fc \
--hash=sha256:ca9346f85e3a4aeeb64d491045c16b9a35647abd37ea15efe53080eb8b090baf
# via onlinejudge
h11==0.16.0 \
--hash=sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1 \
--hash=sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86
# via httpcore
httpcore==1.0.9 \
--hash=sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55 \
--hash=sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8
# via httpx
httpx==0.28.1 \
--hash=sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc \
--hash=sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad
# via openai
hyperlink==21.0.0 \
--hash=sha256:427af957daa58bc909471c6c40f74c5450fa123dd093fc53efd2e91d2705a56b \
--hash=sha256:e6b14c37ecb73e89c77d78cdb4c2cc8f3fb59a885c5b3f819ff4ed80f25af1b4
# via
# autobahn
# twisted
idna==3.13 \
--hash=sha256:585ea8fe5d69b9181ec1afba340451fba6ba764af97026f92a91d4eef164a242 \
--hash=sha256:892ea0cde124a99ce773decba204c5552b69c3c67ffd5f232eb7696135bc8bb3
# via
# anyio
# httpx
# hyperlink
# requests
# twisted
incremental==24.11.0 \
--hash=sha256:87d3480dbb083c1d736222511a8cf380012a8176c2456d01ef483242abbbcf8c \
--hash=sha256:a34450716b1c4341fe6676a0598e88a39e04189f4dce5dc96f656e040baa10b3
# via twisted
jiter==0.14.0 \
--hash=sha256:004df5fdb8ecbd6d99f3227df18ba1a259254c4359736a2e6f036c944e02d7c5 \
--hash=sha256:14c0cb10337c49f5eafe8e7364daca5e29a020ea03580b8f8e6c597fed4e1588 \
--hash=sha256:1aca29ba52913f78362ec9c2da62f22cdc4c3083313403f90c15460979b84d9b \
--hash=sha256:1bf7ff85517dd2f20a5750081d2b75083c1b269cf75afc7511bdf1f9548beb3b \
--hash=sha256:215a6cb8fb7dc702aa35d475cc00ddc7f970e5c0b1417fb4b4ac5d82fa2a29db \
--hash=sha256:2492e5f06c36a976d25c7cc347a60e26d5470178d44cde1b9b75e60b4e519f28 \
--hash=sha256:260bf7ca20704d58d41f669e5e9fe7fe2fa72901a6b324e79056f5d52e9c9be2 \
--hash=sha256:26679d58ba816f88c3849306dd58cb863a90a1cf352cdd4ef67e30ccf8a77994 \
--hash=sha256:2e692633a12cda97e352fdcd1c4acc971b1c28707e1e33aeef782b0cbf051975 \
--hash=sha256:2f7877ed45118de283786178eceaf877110abacd04fde31efff3940ae9672674 \
--hash=sha256:2fb2ce3a7bc331256dfb14cefc34832366bb28a9aca81deaf43bbf2a5659e607 \
--hash=sha256:33a20d838b91ef376b3a56896d5b04e725c7df5bc4864cc6569cf046a8d73b6d \
--hash=sha256:34f19dcc35cb1abe7c369b3756babf8c7f04595c0807a848df8f26ef8298ef92 \
--hash=sha256:351bf6eda4e3a7ceb876377840c702e9a3e4ecc4624dbfb2d6463c67ae52637d \
--hash=sha256:37826e3df29e60f30a382f9294348d0238ef127f4b5d7f5f8da78b5b9e050560 \
--hash=sha256:3a99c1387b1f2928f799a9de899193484d66206a50e98233b6b088a7f0c1edb2 \
--hash=sha256:432c4db5255d86a259efde91e55cb4c8d18c0521d844c9e2e7efcce3899fb016 \
--hash=sha256:4b77da71f6e819be5fbcec11a453fde5b1d0267ef6ed487e2a392fd8e14e4e3a \
--hash=sha256:5252a7ca23785cef5d02d4ece6077a1b556a410c591b379f82091c3001e14844 \
--hash=sha256:5419d4aa2024961da9fe12a9cfe7484996735dca99e8e090b5c88595ef1951ff \
--hash=sha256:5dec7c0a3e98d2a3f8a2e67382d0d7c3ac60c69103a4b271da889b4e8bb1e129 \
--hash=sha256:6112f26f5afc75bcb475787d29da3aa92f9d09c7858f632f4be6ffe607be82e9 \
--hash=sha256:62fe2451f8fcc0240261e6a4df18ecbcd58327857e61e625b2393ea3b468aac9 \
--hash=sha256:645be49c46f2900937ba0eaf871ad5183c96858c0af74b6becc7f4e367e36e06 \
--hash=sha256:651a8758dd413c51e3b7f6557cdc6921faf70b14106f45f969f091f5cda990ea \
--hash=sha256:67f00d94b281174144d6532a04b66a12cb866cbdc47c3af3bfe2973677f9861a \
--hash=sha256:6f396837fc7577871ca8c12edaf239ed9ccef3bbe39904ae9b8b63ce0a48b140 \
--hash=sha256:7282342d32e357543565286b6450378c3cd402eea333fc1ebe146f1fabb306fc \
--hash=sha256:7609cfbe3a03d37bfdbf5052012d5a879e72b83168a363deae7b3a26564d57de \
--hash=sha256:77f4ea612fe8b84b8b04e51d0e78029ecf3466348e25973f953de6e6a59aa4c1 \
--hash=sha256:78d918a68b26e9fab068c2b5453577ef04943ab2807b9a6275df2a812599a310 \
--hash=sha256:7b25beaa0d4447ea8c7ae0c18c688905d34840d7d0b937f2f7bdd52162c98a40 \
--hash=sha256:7d9d51eb96c82a9652933bd769fe6de66877d6eb2b2440e281f2938c51b5643e \
--hash=sha256:7ede4331a1899d604463369c730dbb961ffdc5312bc7f16c41c2896415b1304a \
--hash=sha256:801028dcfc26ac0895e4964cbc0fd62c73be9fd4a7d7b1aaf6e5790033a719b7 \
--hash=sha256:80381f5a19af8fa9aef743f080e34f6b25ebd89656475f8cf0470ec6157052aa \
--hash=sha256:882bcb9b334318e233950b8be366fe5f92c86b66a7e449e76975dfd6d776a01f \
--hash=sha256:8b39b7d87a952b79949af5fef44d2544e58c21a28da7f1bae3ef166455c61746 \
--hash=sha256:92cd8b6025981a041f5310430310b55b25ca593972c16407af8837d3d7d2ca01 \
--hash=sha256:9b8c571a5dba09b98bd3462b5a53f27209a5cbbe85670391692ede71974e979f \
--hash=sha256:a4d50ea3d8ba4176f79754333bd35f1bbcd28e91adc13eb9b7ca91bc52a6cef9 \
--hash=sha256:ab18d11074485438695f8d34a1b6da61db9754248f96d51341956607a8f39985 \
--hash=sha256:ad425b087aafb4a1c7e1e98a279200743b9aaf30c3e0ba723aec93f061bd9bc8 \
--hash=sha256:ae039aaef8de3f8157ecc1fdd4d85043ac4f57538c245a0afaecb8321ec951c3 \
--hash=sha256:af72f204cf4d44258e5b4c1745130ac45ddab0e71a06333b01de660ab4187a94 \
--hash=sha256:b08997c35aee1201c1a5361466a8fb9162d03ae7bf6568df70b6c859f1e654a4 \
--hash=sha256:bd77945f38866a448e73b0b7637366afa814d4617790ecd88a18ca74377e6c02 \
--hash=sha256:be808176a6a3a14321d18c603f2d40741858a7c4fc982f83232842689fe86dd9 \
--hash=sha256:c1dcfbeb93d9ecd9ca128bbf8910120367777973fa193fb9a39c31237d8df165 \
--hash=sha256:c409578cbd77c338975670ada777add4efd53379667edf0aceea730cabede6fb \
--hash=sha256:c8ef8791c3e78d6c6b157c6d360fbb5c715bebb8113bc6a9303c5caff012754a \
--hash=sha256:ce17f8a050447d1b4153bda4fb7d26e6a9e74eb4f4a41913f30934c5075bf615 \
--hash=sha256:cff5708f7ed0fa098f2b53446c6fa74c48469118e5cd7497b4f1cd569ab06928 \
--hash=sha256:d824ca4148b705970bf4e120924a212fdfca9859a73e42bd7889a63a4ea6bb98 \
--hash=sha256:e1a7eead856a5038a8d291f1447176ab0b525c77a279a058121b5fccee257f6f \
--hash=sha256:e74663b8b10da1fe0f4e4703fd7980d24ad17174b6bb35d8498d6e3ebce2ae6a \
--hash=sha256:e89bcd7d426a75bb4952c696b267075790d854a07aad4c9894551a82c5b574ab \
--hash=sha256:e8a39e66dac7153cf3f964a12aad515afa8d74938ec5cc0018adcdae5367c79e \
--hash=sha256:ee4a72f12847ef29b072aee9ad5474041ab2924106bdca9fcf5d7d965853e057 \
--hash=sha256:f2d4c61da0821ee42e0cdf5489da60a6d074306313a377c2b35af464955a3611 \
--hash=sha256:f4f1c4b125e1652aefbc2e2c1617b60a160ab789d180e3d423c41439e5f32850 \
--hash=sha256:fbd9e482663ca9d005d051330e4d2d8150bb208a209409c10f7e7dfdf7c49da9 \
--hash=sha256:fc4ab96a30fb3cb2c7e0cd33f7616c8860da5f5674438988a54ac717caccdbaa \
--hash=sha256:fc7e37b4b8bc7e80a63ad6cfa5fc11fab27dbfea4cc4ae644b1ab3f273dc348f \
--hash=sha256:ff3a6465b3a0f54b1a430f45c3c0ba7d61ceb45cbc3e33f9e1a7f638d690baf3
# via openai
lxml==6.1.0 \
--hash=sha256:00750d63ef0031a05331b9223463b1c7c02b9004cef2346a5b2877f0f9494dd2 \
--hash=sha256:022981127642fe19866d2907d76241bb07ed21749601f727d5d5dd1ce5d1b773 \
--hash=sha256:05b9b8787e35bec69e68daf4952b2e6dfcfb0db7ecf1a06f8cdfbbac4eb71aad \
--hash=sha256:0f0f08beb0182e3e9a86fae124b3c47a7b41b7b69b225e1377db983802404e54 \
--hash=sha256:1081dd10bc6fa437db2500e13993abf7cc30716d0a2f40e65abb935f02ec559c \
--hash=sha256:1ae225f66e5938f4fa29d37e009a3bb3b13032ac57eb4eb42afa44f6e4054e69 \
--hash=sha256:2173a7bffe97667bbf0767f8a99e587740a8c56fdf3befac4b09cb29a80276fd \
--hash=sha256:21c3302068f50d1e8728c67c87ba92aa87043abee517aa2576cca1855326b405 \
--hash=sha256:23cad0cc86046d4222f7f418910e46b89971c5a45d3c8abfad0f64b7b05e4a9b \
--hash=sha256:264c605ab9c0e4aa1a679636f4582c4d3313700009fac3ec9c3412ed0d8f3e1d \
--hash=sha256:26dd9f57ee3bd41e7d35b4c98a2ffd89ed11591649f421f0ec19f67d50ec67ac \
--hash=sha256:28902146ffbe5222df411c5d19e5352490122e14447e98cd118907ee3fd6ee62 \
--hash=sha256:30e7b2ed63b6c8e97cca8af048589a788ab5c9c905f36d9cf1c2bb549f450d2f \
--hash=sha256:32662519149fd7a9db354175aa5e417d83485a8039b8aaa62f873ceee7ea4cad \
--hash=sha256:3648f20d25102a22b6061c688beb3a805099ea4beb0a01ce62975d926944d292 \
--hash=sha256:37fabd1452852636cf38ecdcc9dd5ca4bba7a35d6c53fa09725deeb894a87491 \
--hash=sha256:398443df51c538bd578529aa7e5f7afc6c292644174b47961f3bf87fe5741120 \
--hash=sha256:3f00972f84450204cd5d93a5395965e348956aaceaadec693a22ec743f8ae3eb \
--hash=sha256:40d9189f80075f2e1f88db21ef815a2b17b28adf8e50aaf5c789bfe737027f32 \
--hash=sha256:419c58fc92cc3a2c3fa5f78c63dbf5da70c1fa9c1b25f25727ecee89a96c7de2 \
--hash=sha256:43e4d297f11080ec9d64a4b1ad7ac02b4484c9f0e2179d9c4ef78e886e747b88 \
--hash=sha256:45e9dfbd1b661eb64ba0d4dbe762bd210c42d86dd1e5bd2bdf89d634231beb43 \
--hash=sha256:47024feaae386a92a146af0d2aeed65229bf6fff738e6a11dda6b0015fb8fd03 \
--hash=sha256:4937460dc5df0cdd2f06a86c285c28afda06aefa3af949f9477d3e8df430c485 \
--hash=sha256:4a1503c56e4e2b38dc76f2f2da7bae69670c0f1933e27cfa34b2fa5876410b16 \
--hash=sha256:4b89b098105b8599dc57adac95d1813409ac476d3c948a498775d3d0c6124bfb \
--hash=sha256:4bd1bdb8a9e0e2dd229de19b5f8aebac80e916921b4b2c6ef8a52bc131d0c1f9 \
--hash=sha256:56971379bc5ee8037c5a0f09fa88f66cdb7d37c3e38af3e45cf539f41131ac1f \
--hash=sha256:5715e0e28736a070f3f34a7ccc09e2fdcba0e3060abbcf61a1a5718ff6d6b105 \
--hash=sha256:5d27bbe326c6b539c64b42638b18bc6003a8d88f76213a97ac9ed4f885efeab7 \
--hash=sha256:63aeafc26aac0be8aff14af7871249e87ea1319be92090bfd632ec68e03b16a5 \
--hash=sha256:690022c7fae793b0489aa68a658822cea83e0d5933781811cabbf5ea3bcfe73d \
--hash=sha256:6fd8b1df8254ff4fd93fd31da1fc15770bde23ac045be9bb1f87425702f61cc9 \
--hash=sha256:73becf6d8c81d4c76b1014dbd3584cb26d904492dcf73ca85dc8bff08dcd6d2d \
--hash=sha256:73d658216fc173cf2c939e90e07b941c5e12736b0bf6a99e7af95459cfe8eabb \
--hash=sha256:75c4c7c619a744f972f4451bf5adf6d0fb00992a1ffc9fd78e13b0bc817cc99f \
--hash=sha256:77b9f99b17cbf14026d1e618035077060fc7195dd940d025149f3e2e830fbfcb \
--hash=sha256:7e39ab3a28af7784e206d8606ec0e4bcad0190f63a492bca95e94e5a4aef7f6e \
--hash=sha256:7f4a77d6f7edf9230cee3e1f7f6764722a41604ee5681844f18db9a81ea0ec33 \
--hash=sha256:80410c3a7e3c617af04de17caa9f9f20adaa817093293d69eae7d7d0522836f5 \
--hash=sha256:89e8d73d09ac696a5ba42ec69787913d53284f12092f651506779314f10ba585 \
--hash=sha256:8c8984e1d8c4b3949e419158fda14d921ff703a9ed8a47236c6eb7a2b6cb4946 \
--hash=sha256:8e369cbd690e788c8d15e56222d91a09c6a417f49cbc543040cba0fe2e25a79e \
--hash=sha256:9147d8e386ec3b82c3b15d88927f734f565b0aaadef7def562b853adca45784a \
--hash=sha256:972a6451204798675407beaad97b868d0c733d9a74dafefc63120b81b8c2de28 \
--hash=sha256:97faa0860e13b05b15a51fb4986421ef7a30f0b3334061c416e0981e9450ca4c \
--hash=sha256:9e7b0a4ca6dcc007a4cef00a761bba2dea959de4bd2df98f926b33c92ca5dfb9 \
--hash=sha256:9eb667bf50856c4a58145f8ca2d5e5be160191e79eb9e30855a476191b3c3495 \
--hash=sha256:a0092f2b107b69601adf562a57c956fbb596e05e3e6651cabd3054113b007e45 \
--hash=sha256:a2853c8b2170cc6cd54a6b4d50d2c1a8a7aeca201f23804b4898525c7a152cfc \
--hash=sha256:ab863fd37458fed6456525f297d21239d987800c46e67da5ef04fc6b3dd93ac8 \
--hash=sha256:ac4db068889f8772a4a698c5980ec302771bb545e10c4b095d4c8be26749616f \
--hash=sha256:bba078de0031c219e5dd06cf3e6bf8fb8e6e64a77819b358f53bb132e3e03366 \
--hash=sha256:bc783ee3147e60a25aa0445ea82b3e8aabb83b240f2b95d32cb75587ff781814 \
--hash=sha256:be10838781cb3be19251e276910cd508fe127e27c3242e50521521a0f3781690 \
--hash=sha256:bfd57d8008c4965709a919c3e9a98f76c2c7cb319086b3d26858250620023b13 \
--hash=sha256:c3592631e652afa34999a088f98ba7dfc7d6aff0d535c410bea77a71743f3819 \
--hash=sha256:c4a699432846df86cc3de502ee85f445ebad748a1c6021d445f3e514d2cd4b1c \
--hash=sha256:c4e425db0c5445ef0ad56b0eec54f89b88b2d884656e536a90b2f52aecb4ca86 \
--hash=sha256:c6854e9cf99c84beb004eecd7d3a3868ef1109bf2b1df92d7bc11e96a36c2180 \
--hash=sha256:cbd7b79cdcb4986ad78a2662625882747f09db5e4cd7b2ae178a88c9c51b3dfe \
--hash=sha256:cc16682cc987a3da00aa56a3aa3075b08edb10d9b1e476938cfdbee8f3b67181 \
--hash=sha256:d2f17a16cd8751e8eb233a7e41aecdf8e511712e00088bf9be455f604cd0d28d \
--hash=sha256:d6d8efe71429635f0559579092bb5e60560d7b9115ee38c4adbea35632e7fa24 \
--hash=sha256:dabecc48db5f42ba348d1f5d5afdc54c6c4cc758e676926c7cd327045749517d \
--hash=sha256:e0af85773850417d994d019741239b901b22c6680206f46a34766926e466141d \
--hash=sha256:e3dd5fe19c9e0ac818a9c7f132a5e43c1339ec1cbbfecb1a938bd3a47875b7c9 \
--hash=sha256:e69aa6805905807186eb00e66c6d97a935c928275182eb02ee40ba00da9623b2 \
--hash=sha256:ebe33f4ec1b2de38ceb225a1749a2965855bffeef435ba93cd2d5d540783bf2f \
--hash=sha256:f0cea5b1d3e6e77d71bd2b9972eb2446221a69dc52bb0b9c3c6f6e5700592d93 \
--hash=sha256:fc46da94826188ed45cb53bd8e3fc076ae22675aea2087843d4735627f867c6d \
--hash=sha256:fc7140d7a7386e6b545d41b7358f4d02b656d4053f5fa6859f92f4b9c2572c4d \
--hash=sha256:fe022f20bc4569ec66b63b3fb275a3d628d9d32da6326b2982584104db6d3086
# via python-cas
msgpack==1.1.2 \
--hash=sha256:04fb995247a6e83830b62f0b07bf36540c213f6eac8e851166d8d86d83cbd014 \
--hash=sha256:180759d89a057eab503cf62eeec0aa61c4ea1200dee709f3a8e9397dbb3b6931 \
--hash=sha256:1d1418482b1ee984625d88aa9585db570180c286d942da463533b238b98b812b \
--hash=sha256:1de460f0403172cff81169a30b9a92b260cb809c4cb7e2fc79ae8d0510c78b6b \
--hash=sha256:1fdf7d83102bf09e7ce3357de96c59b627395352a4024f6e2458501f158bf999 \
--hash=sha256:1fff3d825d7859ac888b0fbda39a42d59193543920eda9d9bea44d958a878029 \
--hash=sha256:2929af52106ca73fcb28576218476ffbb531a036c2adbcf54a3664de124303e9 \
--hash=sha256:372839311ccf6bdaf39b00b61288e0557916c3729529b301c52c2d88842add42 \
--hash=sha256:3b60763c1373dd60f398488069bcdc703cd08a711477b5d480eecc9f9626f47e \
--hash=sha256:42eefe2c3e2af97ed470eec850facbe1b5ad1d6eacdbadc42ec98e7dcf68b4b7 \
--hash=sha256:446abdd8b94b55c800ac34b102dffd2f6aa0ce643c55dfc017ad89347db3dbdb \
--hash=sha256:4efd7b5979ccb539c221a4c4e16aac1a533efc97f3b759bb5a5ac9f6d10383bf \
--hash=sha256:5559d03930d3aa0f3aacb4c42c776af1a2ace2611871c84a75afe436695e6245 \
--hash=sha256:5928604de9b032bc17f5099496417f113c45bc6bc21b5c6920caf34b3c428794 \
--hash=sha256:59415c6076b1e30e563eb732e23b994a61c159cec44deaf584e5cc1dd662f2af \
--hash=sha256:5a46bf7e831d09470ad92dff02b8b1ac92175ca36b087f904a0519857c6be3ff \
--hash=sha256:6c15b7d74c939ebe620dd8e559384be806204d73b4f9356320632d783d1f7939 \
--hash=sha256:70a0dff9d1f8da25179ffcf880e10cf1aad55fdb63cd59c9a49a1b82290062aa \
--hash=sha256:70c5a7a9fea7f036b716191c29047374c10721c389c21e9ffafad04df8c52c90 \
--hash=sha256:80a0ff7d4abf5fecb995fcf235d4064b9a9a8a40a3ab80999e6ac1e30b702717 \
--hash=sha256:897c478140877e5307760b0ea66e0932738879e7aa68144d9b78ea4c8302a84a \
--hash=sha256:8e22ab046fa7ede9e36eeb4cfad44d46450f37bb05d5ec482b02868f451c95e2 \
--hash=sha256:99e2cb7b9031568a2a5c73aa077180f93dd2e95b4f8d3b8e14a73ae94a9e667e \
--hash=sha256:9ade919fac6a3e7260b7f64cea89df6bec59104987cbea34d34a2fa15d74310b \
--hash=sha256:a465f0dceb8e13a487e54c07d04ae3ba131c7c5b95e2612596eafde1dccf64a9 \
--hash=sha256:a668204fa43e6d02f89dbe79a30b0d67238d9ec4c5bd8a940fc3a004a47b721b \
--hash=sha256:a7787d353595c7c7e145e2331abf8b7ff1e6673a6b974ded96e6d4ec09f00c8c \
--hash=sha256:be52a8fc79e45b0364210eef5234a7cf8d330836d0a64dfbb878efa903d84620 \
--hash=sha256:be5980f3ee0e6bd44f3a9e9dea01054f175b50c3e6cdb692bc9424c0bbb8bf69 \
--hash=sha256:c63eea553c69ab05b6747901b97d620bb2a690633c77f23feb0c6a947a8a7b8f \
--hash=sha256:d62ce1f483f355f61adb5433ebfd8868c5f078d1a52d042b0a998682b4fa8c27 \
--hash=sha256:d99ef64f349d5ec3293688e91486c5fdb925ed03807f64d98d205d2713c60b46 \
--hash=sha256:e23ce8d5f7aa6ea6d2a2b326b4ba46c985dbb204523759984430db7114f8aa00 \
--hash=sha256:e69b39f8c0aa5ec24b57737ebee40be647035158f14ed4b40e6f150077e21a84 \
--hash=sha256:f2cb069d8b981abc72b41aea1c580ce92d57c673ec61af4c500153a626cb9e20 \
--hash=sha256:fac4be746328f90caa3cd4bc67e6fe36ca2bf61d5c6eb6d895b6527e3f05071e \
--hash=sha256:fffee09044073e69f2bad787071aeec727183e7580443dfeb8556cbf1978d162
# via
# autobahn
# channels-redis
openai==2.34.0 \
--hash=sha256:828b4efcbb126352c2b5eb97d33ae890c92a71ab72511aefc1b7fe64aeccb07b \
--hash=sha256:c996a71b1a210f3569844572ad4c609307e978515fb76877cf449b72596e549e
# via onlinejudge
otpauth==2.2.1 \
--hash=sha256:169a7adbd715fca687f6a66d02ccdbefc229fb49f8a634b958d286f908134d59 \
--hash=sha256:b7eabe0ed91cb67eb3054b7f517e4b4a7495fb30eaf2951897d41c8feef5de73
# via onlinejudge
packaging==26.2 \
--hash=sha256:5fc45236b9446107ff2415ce77c807cee2862cb6fac22b8a73826d0693b0980e \
--hash=sha256:ff452ff5a3e828ce110190feff1178bb1f2ea2281fa2075aadb987c2fb221661
# via
# gunicorn
# incremental
pillow==12.2.0 \
--hash=sha256:00a2865911330191c0b818c59103b58a5e697cae67042366970a6b6f1b20b7f9 \
--hash=sha256:01afa7cf67f74f09523699b4e88c73fb55c13346d212a59a2db1f86b0a63e8c5 \
--hash=sha256:03e7e372d5240cc23e9f07deca4d775c0817bffc641b01e9c3af208dbd300987 \
--hash=sha256:03f6fab9219220f041c74aeaa2939ff0062bd5c364ba9ce037197f4c6d498cd9 \
--hash=sha256:042db20a421b9bafecc4b84a8b6e444686bd9d836c7fd24542db3e7df7baad9b \
--hash=sha256:0a34329707af4f73cf1782a36cd2289c0368880654a2c11f027bcee9052d35dd \
--hash=sha256:144748b3af2d1b358d41286056d0003f47cb339b8c43a9ea42f5fea4d8c66b6e \
--hash=sha256:1610dd6c61621ae1cf811bef44d77e149ce3f7b95afe66a4512f8c59f25d9ebe \
--hash=sha256:1e1757442ed87f4912397c6d35a0db6a7b52592156014706f17658ff58bbf795 \
--hash=sha256:22db17c68434de69d8ecfc2fe821569195c0c373b25cccb9cbdacf2c6e53c601 \
--hash=sha256:2bb4a8d594eacdfc59d9e5ad972aa8afdd48d584ffd5f13a937a664c3e7db0ed \
--hash=sha256:2c727a6d53cb0018aadd8018c2b938376af27914a68a492f59dfcaca650d5eea \
--hash=sha256:2d192a155bbcec180f8564f693e6fd9bccff5a7af9b32e2e4bf8c9c69dbad6b5 \
--hash=sha256:2e5a76d03a6c6dcef67edabda7a52494afa4035021a79c8558e14af25313d453 \
--hash=sha256:325ca0528c6788d2a6c3d40e3568639398137346c3d6e66bb61db96b96511c98 \
--hash=sha256:390ede346628ccc626e5730107cde16c42d3836b89662a115a921f28440e6a3b \
--hash=sha256:3adc9215e8be0448ed6e814966ecf3d9952f0ea40eb14e89a102b87f450660d8 \
--hash=sha256:4bfd07bc812fbd20395212969e41931001fd59eb55a60658b0e5710872e95286 \
--hash=sha256:4e6c62e9d237e9b65fac06857d511e90d8461a32adcc1b9065ea0c0fa3a28150 \
--hash=sha256:50d8520da2a6ce0af445fa6d648c4273c3eeefbc32d7ce049f22e8b5c3daecc2 \
--hash=sha256:56b25336f502b6ed02e889f4ece894a72612fe885889a6e8c4c80239ff6e5f5f \
--hash=sha256:57850958fe9c751670e49b2cecf6294acc99e562531f4bd317fa5ddee2068463 \
--hash=sha256:58f62cc0f00fd29e64b29f4fd923ffdb3859c9f9e6105bfc37ba1d08994e8940 \
--hash=sha256:5c0a9f29ca8e79f09de89293f82fc9b0270bb4af1d58bc98f540cc4aedf03166 \
--hash=sha256:5cdfebd752ec52bf5bb4e35d9c64b40826bc5b40a13df7c3cda20a2c03a0f5ed \
--hash=sha256:5d2fd0fa6b5d9d1de415060363433f28da8b1526c1c129020435e186794b3795 \
--hash=sha256:62f5409336adb0663b7caa0da5c7d9e7bdbaae9ce761d34669420c2a801b2780 \
--hash=sha256:632ff19b2778e43162304d50da0181ce24ac5bb8180122cbe1bf4673428328c7 \
--hash=sha256:6562ace0d3fb5f20ed7290f1f929cae41b25ae29528f2af1722966a0a02e2aa1 \
--hash=sha256:6a6e67ea2e6feda684ed370f9a1c52e7a243631c025ba42149a2cc5934dec295 \
--hash=sha256:6a9adfc6d24b10f89588096364cc726174118c62130c817c2837c60cf08a392b \
--hash=sha256:6bb77b2dcb06b20f9f4b4a8454caa581cd4dd0643a08bacf821216a16d9c8354 \
--hash=sha256:7371b48c4fa448d20d2714c9a1f775a81155050d383333e0a6c15b1123dda005 \
--hash=sha256:766cef22385fa1091258ad7e6216792b156dc16d8d3fa607e7545b2b72061f1c \
--hash=sha256:7b14cc0106cd9aecda615dd6903840a058b4700fcb817687d0ee4fc8b6e389be \
--hash=sha256:7f84204dee22a783350679a0333981df803dac21a0190d706a50475e361c93f5 \
--hash=sha256:8023abc91fba39036dbce14a7d6535632f99c0b857807cbbbf21ecc9f4717f06 \
--hash=sha256:80b2da48193b2f33ed0c32c38140f9d3186583ce7d516526d462645fd98660ae \
--hash=sha256:8297651f5b5679c19968abefd6bb84d95fe30ef712eb1b2d9b2d31ca61267f4c \
--hash=sha256:88ddbc66737e277852913bd1e07c150cc7bb124539f94c4e2df5344494e0a612 \
--hash=sha256:8cbeb542b2ebc6fcdacabf8aca8c1a97c9b3ad3927d46b8723f9d4f033288a0f \
--hash=sha256:8e9c4f5b3c546fa3458a29ab22646c1c6c787ea8f5ef51300e5a60300736905e \
--hash=sha256:9aba9a17b623ef750a4d11b742cbafffeb48a869821252b30ee21b5e91392c50 \
--hash=sha256:9f08483a632889536b8139663db60f6724bfcb443c96f1b18855860d7d5c0fd4 \
--hash=sha256:a830b1a40919539d07806aa58e1b114df53ddd43213d9c8b75847eee6c0182b5 \
--hash=sha256:aa88ccfe4e32d362816319ed727a004423aab09c5cea43c01a4b435643fa34eb \
--hash=sha256:af73337013e0b3b46f175e79492d96845b16126ddf79c438d7ea7ff27783a414 \
--hash=sha256:b1c1fbd8a5a1af3412a0810d060a78b5136ec0836c8a4ef9aa11807f2a22f4e1 \
--hash=sha256:b86024e52a1b269467a802258c25521e6d742349d760728092e1bc2d135b4d76 \
--hash=sha256:d362d1878f00c142b7e1a16e6e5e780f02be8195123f164edf7eddd911eefe7c \
--hash=sha256:d5d38f1411c0ed9f97bcb49b7bd59b6b7c314e0e27420e34d99d844b9ce3b6f3 \
--hash=sha256:dac8d77255a37e81a2efcbd1fc05f1c15ee82200e6c240d7e127e25e365c39ea \
--hash=sha256:dd025009355c926a84a612fecf58bb315a3f6814b17ead51a8e48d3823d9087f \
--hash=sha256:deede7c263feb25dba4e82ea23058a235dcc2fe1f6021025dc71f2b618e26104 \
--hash=sha256:ee3120ae9dff32f121610bb08e4313be87e03efeadfc6c0d18f89127e24d0c24 \
--hash=sha256:eedf4b74eda2b5a4b2b2fb4c006d6295df3bf29e459e198c90ea48e130dc75c3 \
--hash=sha256:efd8c21c98c5cc60653bcb311bef2ce0401642b7ce9d09e03a7da87c878289d4 \
--hash=sha256:f1c943e96e85df3d3478f7b691f229887e143f81fedab9b20205349ab04d73ed \
--hash=sha256:f278f034eb75b4e8a13a54a876cc4a5ab39173d2cdd93a638e1b467fc545ac43 \
--hash=sha256:f3f40b3c5a968281fd507d519e444c35f0ff171237f4fdde090dd60699458421 \
--hash=sha256:fb043ee2f06b41473269765c2feae53fc2e2fbf96e5e22ca94fb5ad677856f06 \
--hash=sha256:fc3d34d4a8fbec3e88a79b92e5465e0f9b842b628675850d860b8bd300b159f5
# via onlinejudge
psycopg==3.3.4 \
--hash=sha256:b6bbc25ccf05c8fad3b061d9db2ef0909a555171b84b07f29458a447253d679a \
--hash=sha256:e21207764952cff81b6b8bdacad9a3939f2793367fdac2987b3aac36a651b5bc
# via onlinejudge
psycopg-binary==3.3.4 \
--hash=sha256:018fbed325936da502feb546642c982dcc4b9ffdea32dfef78dbf3b7f7ad4070 \
--hash=sha256:136f199a407b5348b9b857c504aff60c77622a28482e7195839ce1b51238c4cc \
--hash=sha256:17a21953a9e5ff3a16dab692625a3676e2f101db5e40072f39dbee2250194d68 \
--hash=sha256:1dc1f79fd16bb1f3f4421417a514607539f17804d95c7ed617265369d1981cae \
--hash=sha256:1fbaa292a3c8bb61b45df1ad3da1908ccee7cb889db9425e3557d9e34e2a4829 \
--hash=sha256:26df2717e59c0473e4465a97dfb1b7afebaa479277870fd5784d1436470db47c \
|
||||
--hash=sha256:28b7398fdd19db3232c884fb24550bdfe951221f510e195e233299e4c9b78f97 \
|
||||
--hash=sha256:2c09aad7051326e7603c14e50636db9c01f78272dc54b3accff03d46370461e6 \
|
||||
--hash=sha256:46893c26858be12cc49ca4226ed6a60b4bfccadd946b3bebb783a60b38788228 \
|
||||
--hash=sha256:47c656a8a7ba6eb0cff1801a4caaa9c8bdc12d03080e273aff1c8ac39971a77e \
|
||||
--hash=sha256:494ca54901be8cf9eb7e02c25b731f2317c378efa44f43e8f9bd0e1184ae7be4 \
|
||||
--hash=sha256:514404ed543efd620c85602b747df2a23cf1241b4067199e1a66f2d2757aaa41 \
|
||||
--hash=sha256:580ae30a5f95ccd90008ec697d3ed6a4a2047a516407ad904283fa42086936e9 \
|
||||
--hash=sha256:5ab28a2a7649df3b72e6b674b4c190e448e8e77cf496a65bd846472048de2089 \
|
||||
--hash=sha256:5c4ab71be17bdca30cb34c34c4e1496e2f5d6f20c199c12bad226070b22ef9bf \
|
||||
--hash=sha256:6402a9d8146cf4b3974ded3fd28a971e83dc6a0333eb7822524a3aa20b546578 \
|
||||
--hash=sha256:6b9016b1714da4dd5ecaaa75b82098aa5a0b87854ce9b092e21c27c4ae23e014 \
|
||||
--hash=sha256:71e55ccbdfae79a2ed9c6369c3008a3025817ff9d7e27b32a2d84e2a4267e66e \
|
||||
--hash=sha256:75a9067e236f9b9ae3535b66fe99bddb33d39c0de10112e49b9ab11eee53dc31 \
|
||||
--hash=sha256:773d573e11f437ce0bdb95b7c18dc58390494f96d43f8b45b9760436114f7652 \
|
||||
--hash=sha256:77df19583501ea288eaf15ac0fe7ad01e6d8091a91d5c41df5c718f307d8e31b \
|
||||
--hash=sha256:8c0056529e68dbe9184cd4019a1f3d8f3a4ead2f6fc7a5afcf27d3314edd1277 \
|
||||
--hash=sha256:94596f9e7633ee3f6440711d43bb70aa31cc0a46a900ab8b4201a366ace5c9e7 \
|
||||
--hash=sha256:b56b603ebcea8aa10b46228b8410ba7f13e7c2ee54389d4d9be0927fd8ce2a70 \
|
||||
--hash=sha256:b6f5a29e9c775b9f12a1a717aa7a2c80f9e1db6f27ba44a5b59c80ac61d2ffcf \
|
||||
--hash=sha256:c37e024c07308cd06cf3ec51bfd0e7f6157585a4d84d1bce4a7f5f7913719bf8 \
|
||||
--hash=sha256:c677c4ad433cb7150c8cd304a0769ae3bcfbe5ea0676eb53faa7b1443b16d0d3 \
|
||||
--hash=sha256:dbfdb9b6cc79f31104a7b162a2b921b765fcc62af6c00540a167a8de47e4ed38 \
|
||||
--hash=sha256:df1d567fc430f6df15c9fcf67d87685fc49bdb325adc0db5af1adfb2f44eb5c9 \
|
||||
--hash=sha256:e7510c37550f91a187e3660a8cc50d4b760f8c3b8b2f89ebc5698cd2c7f2c85d \
|
||||
--hash=sha256:eb05ee1c2b817d27c537333224c9e83c7afb86fe7296ba970990068baf819b16 \
|
||||
--hash=sha256:ee17a2cf4943cde261adfad1bbc5bf38d6b3776d7afff74c7cabcbeaeb08c260 \
|
||||
--hash=sha256:fbd1d4ed566895ad2d3bf4ddfd8bae90026930ddf29df3b9d91d32c8c47866a7
|
||||
# via onlinejudge
|
||||
py-ubjson==0.16.1 \
|
||||
--hash=sha256:b9bfb8695a1c7e3632e800fb83c943bf67ed45ddd87cd0344851610c69a5a482
|
||||
# via autobahn
|
||||
pyasn1==0.6.3 \
|
||||
--hash=sha256:697a8ecd6d98891189184ca1fa05d1bb00e2f84b5977c481452050549c8a72cf \
|
||||
--hash=sha256:a80184d120f0864a52a073acc6fc642847d0be408e7c7252f31390c0f4eadcde
|
||||
# via
|
||||
# pyasn1-modules
|
||||
# service-identity
|
||||
pyasn1-modules==0.4.2 \
|
||||
--hash=sha256:29253a9207ce32b64c3ac6600edc75368f98473906e8fd1043bd6b5b1de2c14a \
|
||||
--hash=sha256:677091de870a80aae844b1ca6134f54652fa2c8c5a52aa396440ac3106e941e6
|
||||
# via service-identity
|
||||
pycparser==3.0 ; implementation_name != 'PyPy' \
|
||||
--hash=sha256:600f49d217304a5902ac3c37e1281c9fe94e4d0489de643a9504c5cdfdfc6b29 \
|
||||
--hash=sha256:b727414169a36b7d524c1c3e31839a521725078d7b2ff038656844266160a992
|
||||
# via cffi
|
||||
pydantic==2.13.3 \
|
||||
--hash=sha256:6db14ac8dfc9a1e57f87ea2c0de670c251240f43cb0c30a5130e9720dc612927 \
|
||||
--hash=sha256:af09e9d1d09f4e7fe37145c1f577e1d61ceb9a41924bf0094a36506285d0a84d
|
||||
# via openai
|
||||
pydantic-core==2.46.3 \
|
||||
--hash=sha256:0087084960f209a9a4af50ecd1fb063d9ad3658c07bb81a7a53f452dacbfb2ba \
|
||||
--hash=sha256:031bb17f4885a43773c8c763089499f242aee2ea85cf17154168775dccdecf35 \
|
||||
--hash=sha256:06d5d8820cbbdb4147578c1fe7ffcd5b83f34508cb9f9ab76e807be7db6ff0a4 \
|
||||
--hash=sha256:07bc6d2a28c3adb4f7c6ae46aa4f2d2929af127f587ed44057af50bf1ce0f505 \
|
||||
--hash=sha256:0c9ff69140423eea8ed2d5477df3ba037f671f5e897d206d921bc9fdc39613e7 \
|
||||
--hash=sha256:1105677a6df914b1fb71a81b96c8cce7726857e1717d86001f29be06a25ee6f8 \
|
||||
--hash=sha256:12bc98de041458b80c86c56b24df1d23832f3e166cbaff011f25d187f5c62c37 \
|
||||
--hash=sha256:17eaface65d9fc5abb940003020309c1bf7a211f5f608d7870297c367e6f9022 \
|
||||
--hash=sha256:23cbdb3aaa74dfe0837975dbf69b469753bbde8eacace524519ffdb6b6e89eb7 \
|
||||
--hash=sha256:2798b6ba041b9d70acfb9071a2ea13c8456dd1e6a5555798e41ba7b0790e329c \
|
||||
--hash=sha256:28b5f2ef03416facccb1c6ef744c69793175fd27e44ef15669201601cf423acb \
|
||||
--hash=sha256:28e8cf2f52d72ced402a137145923a762cbb5081e48b34312f7a0c8f55928ec3 \
|
||||
--hash=sha256:28ed528c45446062ee66edb1d33df5d88828ae167de76e773a3c7f64bd14e976 \
|
||||
--hash=sha256:2f40e4246676beb31c5ce77c38a55ca4e465c6b38d11ea1bd935420568e0b1ab \
|
||||
--hash=sha256:367508faa4973b992b271ba1494acaab36eb7e8739d1e47be5035fb1ea225396 \
|
||||
--hash=sha256:3861f1731b90c50a3266316b9044f5c9b405eecb8e299b0a7120596334e4fe9c \
|
||||
--hash=sha256:41c178f65b8c29807239d47e6050262eb6bf84eb695e41101e62e38df4a5bc2c \
|
||||
--hash=sha256:57697d7c056aca4bbb680200f96563e841a6386ac1129370a0102592f4dddff5 \
|
||||
--hash=sha256:5ad3c826fe523e4becf4fe39baa44286cff85ef137c729a2c5e269afbfd0905d \
|
||||
--hash=sha256:5dcbbcf4d22210ced8f837c96db941bdb078f419543472aca5d9a0bb7cddc7df \
|
||||
--hash=sha256:60e5f66e12c4f5212d08522963380eaaeac5ebd795826cfd19b2dfb0c7a52b9c \
|
||||
--hash=sha256:610eda2e3838f401105e6326ca304f5da1e15393ae25dacae5c5c63f2c275b13 \
|
||||
--hash=sha256:68cc7866ed863db34351294187f9b729964c371ba33e31c26f478471c52e1ed0 \
|
||||
--hash=sha256:6e42d83d1c6b87fa56b521479cff237e626a292f3b31b6345c15a99121b454c1 \
|
||||
--hash=sha256:706d9d0ce9cf4593d07270d8e9f53b161f90c57d315aeec4fb4fd7a8b10240d8 \
|
||||
--hash=sha256:75a519dab6d63c514f3a81053e5266c549679e4aa88f6ec57f2b7b854aceb1b0 \
|
||||
--hash=sha256:77706aeb41df6a76568434701e0917da10692da28cb69d5fb6919ce5fdb07374 \
|
||||
--hash=sha256:830d1247d77ad23852314f069e9d7ddafeec5f684baf9d7e7065ed46a049c4e6 \
|
||||
--hash=sha256:85348b8f89d2c3508b65b16c3c33a4da22b8215138d8b996912bb1532868885f \
|
||||
--hash=sha256:87082cd65669a33adeba5470769e9704c7cf026cc30afb9cc77fd865578ebaad \
|
||||
--hash=sha256:8940562319bc621da30714617e6a7eaa6b98c84e8c685bcdc02d7ed5e7c7c44e \
|
||||
--hash=sha256:93fd339f23408a07e98950a89644f92c54d8729719a40b30c0a30bb9ebc55d23 \
|
||||
--hash=sha256:9be3e221bdc6d69abf294dcf7aff6af19c31a5cdcc8f0aa3b14be29df4bd03b1 \
|
||||
--hash=sha256:9ce92e58abc722dac1bf835a6798a60b294e48eb0e625ec9fd994b932ac5feee \
|
||||
--hash=sha256:9d2e32edcc143bc01e95300671915d9ca052d4f745aa0a49c48d4803f8a85f2c \
|
||||
--hash=sha256:a03e6467f0f5ab796a486146d1b887b2dc5e5f9b3288898c1b1c3ad974e53e4a \
|
||||
--hash=sha256:a6cd87cb1575b1ad05ba98894c5b5c96411ef678fa2f6ed2576607095b8d9789 \
|
||||
--hash=sha256:a7610b6a5242a6c736d8ad47fd5fff87fcfe8f833b281b1c409c3d6835d9227f \
|
||||
--hash=sha256:aed19d0c783886d5bd86d80ae5030006b45e28464218747dcf83dabfdd092c7b \
|
||||
--hash=sha256:af8653713055ea18a3abc1537fe2ebc42f5b0bbb768d1eb79fd74eb47c0ac089 \
|
||||
--hash=sha256:afa3aa644f74e290cdede48a7b0bee37d1c35e71b05105f6b340d484af536d9b \
|
||||
--hash=sha256:b11b59b3eee90a80a36701ddb4576d9ae31f93f05cb9e277ceaa09e6bf074a67 \
|
||||
--hash=sha256:b12dd51f1187c2eb489af8e20f880362db98e954b54ab792fa5d92e8bcc6b803 \
|
||||
--hash=sha256:b675ab0a0d5b1c8fdb81195dc5bcefea3f3c240871cdd7ff9a2de8aa50772eb2 \
|
||||
--hash=sha256:b6cdf19bf84128d5e7c37e8a73a0c5c10d51103a650ac585d42dd6ae233f2b7f \
|
||||
--hash=sha256:bcf2a8b2982a6673693eae7348ef3d8cf3979c1d63b54fca7c397a635cc68687 \
|
||||
--hash=sha256:c3212fda0ee959c1dd04c60b601ec31097aaa893573a3a1abd0a47bcac2968c1 \
|
||||
--hash=sha256:ced3310e51aa425f7f77da8bbbb5212616655bedbe82c70944320bc1dbe5e018 \
|
||||
--hash=sha256:cf489cf8986c543939aeee17a09c04d6ffb43bfef8ca16fcbcc5cfdcbed24dba \
|
||||
--hash=sha256:d0793c90c1a3c74966e7975eaef3ed30ebdff3260a0f815a62a22adc17e4c01c \
|
||||
--hash=sha256:d0fe3dce1e836e418f912c1ad91c73357d03e556a4d286f441bf34fed2dbeecf \
|
||||
--hash=sha256:d2d0aead851b66f5245ec0c4fb2612ef457f8bbafefdf65a2bf9d6bac6140f47 \
|
||||
--hash=sha256:e29908922ce9da1a30b4da490bd1d3d82c01dcfdf864d2a74aacee674d0bfa34 \
|
||||
--hash=sha256:ec638c5d194ef8af27db69f16c954a09797c0dc25015ad6123eb2c73a4d271ca \
|
||||
--hash=sha256:ed42e6cc8e1b0e2b9b96e2276bad70ae625d10d6d524aed0c93de974ae029f9f \
|
||||
--hash=sha256:f00a0961b125f1a47af7bcc17f00782e12f4cd056f83416006b30111d941dfa3 \
|
||||
--hash=sha256:f13936129ce841f2a5ddf6f126fea3c43cd128807b5a59588c37cf10178c2e64 \
|
||||
--hash=sha256:f1771ce258afb3e4201e67d154edbbae712a76a6081079fe247c2f53c6322c22 \
|
||||
--hash=sha256:f1f8338dd7a7f31761f1f1a3c47503a9a3b34eea3c8b01fa6ee96408affb5e72 \
|
||||
--hash=sha256:f64b5537ac62b231572879cd08ec05600308636a5d63bcbdb15063a466977bec \
|
||||
--hash=sha256:f80a55484b8d843c8ada81ebf70a682f3f00a3d40e378c06cf17ecb44d280d7d \
|
||||
--hash=sha256:fb528e295ed31570ac3dcc9bfdd6e0150bc11ce6168ac87a8082055cf1a67395 \
|
||||
--hash=sha256:fd35aa21299def8db7ef4fe5c4ff862941a9a158ca7b63d61e66fe67d30416b4 \
|
||||
--hash=sha256:ff5e7783bcc5476e1db448bf268f11cb257b1c276d3e89f00b5727be86dd0127 \
|
||||
--hash=sha256:ffe0883b56cfc05798bf994164d2b2ff03efe2d22022a2bb080f3b626176dd56
|
||||
# via pydantic
|
||||
pyopenssl==26.2.0 \
|
||||
--hash=sha256:4f9d971bc5298b8bc1fab282803da04bf000c755d4ad9d99b52de2569ca19a70 \
|
||||
--hash=sha256:8c6fcecd1183a7fc897548dfe388b0cdb7f37e018200d8409cf33959dbe35387
|
||||
# via twisted
|
||||
python-cas==1.7.2 \
|
||||
--hash=sha256:1c50e0d8e20b0356e571a48e7f987df780eff93a1039ac895aeb0dc78126073e \
|
||||
--hash=sha256:228c540186f52f91605016c3921fee677c214de5454c0b6902956d280c47cadc
|
||||
# via django-cas-ng
|
||||
python-dateutil==2.9.0.post0 \
|
||||
--hash=sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3 \
|
||||
--hash=sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427
|
||||
# via onlinejudge
|
||||
qrcode==8.2 \
|
||||
--hash=sha256:16e64e0716c14960108e85d853062c9e8bba5ca8252c0b4d0231b9df4060ff4f \
|
||||
--hash=sha256:35c3f2a4172b33136ab9f6b3ef1c00260dd2f66f858f24d88418a015f446506c
|
||||
# via onlinejudge
|
||||
redis==7.4.0 \
|
||||
--hash=sha256:64a6ea7bf567ad43c964d2c30d82853f8df927c5c9017766c55a1d1ed95d18ad \
|
||||
--hash=sha256:a9c74a5c893a5ef8455a5adb793a31bb70feb821c86eccb62eebef5a19c429ec
|
||||
# via
|
||||
# channels-redis
|
||||
# django-redis
|
||||
requests==2.33.1 \
|
||||
--hash=sha256:18817f8c57c6263968bc123d237e3b8b08ac046f5456bd1e307ee8f4250d3517 \
|
||||
--hash=sha256:4e6d1ef462f3626a1f0a0a9c42dd93c63bad33f9f1c1937509b8c5c8718ab56a
|
||||
# via python-cas
|
||||
ruff==0.15.12 \
|
||||
--hash=sha256:01da3988d225628b709493d7dc67c3b9b12c0210016b08690ef9bd27970b262b \
|
||||
--hash=sha256:2849ea9f3484c3aca43a82f484210370319e7170df4dfe4843395ddf6c57bc33 \
|
||||
--hash=sha256:83b2f4f2f3b1026b5fb449b467d9264bf22067b600f7b6f41fc5958909f449d0 \
|
||||
--hash=sha256:84a1630093121375a3e2a95b4a6dc7b59e2b4ee76216e32d81aae550a832d002 \
|
||||
--hash=sha256:9ba3b8f1afd7e2e43d8943e55f249e13f9682fde09711644a6e7290eb4f3e339 \
|
||||
--hash=sha256:9cae0f92bd5700d1213188b31cd3bdd2b315361296d10b96b8e2337d3d11f53e \
|
||||
--hash=sha256:9e77c7e51c07fe396826d5969a5b846d9cd4c402535835fb6e21ce8b28fef847 \
|
||||
--hash=sha256:a538f7a82d061cee7be55542aca1d86d1393d55d81d4fcc314370f4340930d4f \
|
||||
--hash=sha256:b0c862b172d695db7598426b8af465e7e9ac00a3ea2a3630ee67eb82e366aaa6 \
|
||||
--hash=sha256:c87a162d61ab3adca47c03f7f717c68672edec7d1b5499e652331780fe74950d \
|
||||
--hash=sha256:d0185894e038d7043ba8fd6aee7499ece6462dc0ea9f1e260c7451807c714c20 \
|
||||
--hash=sha256:dd8aed930da53780d22fc70bdf84452c843cf64f8cb4eb38984319c24c5cd5fd \
|
||||
--hash=sha256:e3bcd123364c3770b8e1b7baaf343cc99a35f197c5c6e8af79015c666c423a6c \
|
||||
--hash=sha256:e852ba9fdc890655e1d78f2df1499efbe0e54126bd405362154a75e2bde159c5 \
|
||||
--hash=sha256:ecea26adb26b4232c0c2ca19ccbc0083a68344180bba2a600605538ce51a40a6 \
|
||||
--hash=sha256:f86f176e188e94d6bdbc09f09bfd9dc729059ad93d0e7390b5a73efe19f8861c \
|
||||
--hash=sha256:fb129f40f114f089ebe0ca56c0d251cf2061b17651d464bb6478dc01e69f11f5 \
|
||||
--hash=sha256:fe87510d000220aa1ed530d4448a7c696a0cae1213e5ec30e5874287b66557b5
|
||||
sentry-sdk==2.59.0 \
|
||||
--hash=sha256:abcf65ee9a9d9cdebf9ad369782408ecca9c1c792686ef06ba34f5ab233527fe \
|
||||
--hash=sha256:cd265808ef8bf3f3edf69b527c0a0b2b6b1322762679e55b8987db2e9584aec1
|
||||
# via onlinejudge
|
||||
service-identity==24.2.0 \
|
||||
--hash=sha256:6b047fbd8a84fd0bb0d55ebce4031e400562b9196e1e0d3e0fe2b8a59f6d4a85 \
|
||||
--hash=sha256:b8683ba13f0d39c6cd5d625d2c5f65421d6d707b013b375c355751557cbe8e09
|
||||
# via twisted
|
||||
six==1.17.0 \
|
||||
--hash=sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274 \
|
||||
--hash=sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81
|
||||
# via python-dateutil
|
||||
sniffio==1.3.1 \
|
||||
--hash=sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2 \
|
||||
--hash=sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc
|
||||
# via openai
|
||||
sqlparse==0.5.5 \
|
||||
--hash=sha256:12a08b3bf3eec877c519589833aed092e2444e68240a3577e8e26148acc7b1ba \
|
||||
--hash=sha256:e20d4a9b0b8585fdf63b10d30066c7c94c5d7a7ec47c889a2d83a3caa93ff28e
|
||||
# via django
|
||||
tqdm==4.67.3 \
|
||||
--hash=sha256:7d825f03f89244ef73f1d4ce193cb1774a8179fd96f31d7e1dcde62092b960bb \
|
||||
--hash=sha256:ee1e4c0e59148062281c49d80b25b67771a127c85fc9676d3be5f243206826bf
|
||||
# via openai
|
||||
twisted==25.5.0 \
|
||||
--hash=sha256:1deb272358cb6be1e3e8fc6f9c8b36f78eb0fa7c2233d2dbe11ec6fee04ea316 \
|
||||
--hash=sha256:8559f654d01a54a8c3efe66d533d43f383531ebf8d81d9f9ab4769d91ca15df7
|
||||
# via daphne
|
||||
txaio==25.12.2 \
|
||||
--hash=sha256:5f6cd6c6b397fc3305790d15efd46a2d5b91cdbefa96543b4f8666aeb56ba026 \
|
||||
--hash=sha256:9f232c21e12aa1ff52690e365b5a0ecfd42cc27a6ec86e1b92ece88f763f4b78
|
||||
# via autobahn
|
||||
typing-extensions==4.15.0 \
|
||||
--hash=sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466 \
|
||||
--hash=sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548
|
||||
# via
|
||||
# anyio
|
||||
# openai
|
||||
# psycopg
|
||||
# pydantic
|
||||
# pydantic-core
|
||||
# pyopenssl
|
||||
# twisted
|
||||
# typing-inspection
|
||||
typing-inspection==0.4.2 \
|
||||
--hash=sha256:4ed1cacbdc298c220f1bd249ed5287caa16f34d44ef4e9c3d0cbad5b521545e7 \
|
||||
--hash=sha256:ba561c48a67c5958007083d386c3295464928b01faa735ab8547c5692e87f464
|
||||
# via pydantic
|
||||
tzdata==2026.2 ; sys_platform == 'win32' \
|
||||
--hash=sha256:9173fde7d80d9018e02a662e168e5a2d04f87c41ea174b139fbef642eda62d10 \
|
||||
--hash=sha256:bbe9af844f658da81a5f95019480da3a89415801f6cc966806612cc7169bffe7
|
||||
# via
|
||||
# django
|
||||
# psycopg
|
||||
u-msgpack-python==2.8.0 ; platform_python_implementation != 'CPython' \
|
||||
--hash=sha256:1d853d33e78b72c4228a2025b4db28cda81214076e5b0422ed0ae1b1b2bb586a \
|
||||
--hash=sha256:b801a83d6ed75e6df41e44518b4f2a9c221dc2da4bcd5380e3a0feda520bc61a
|
||||
# via autobahn
|
||||
ujson==5.12.0 \
|
||||
--hash=sha256:02f93da7a4115e24f886b04fd56df1ee8741c2ce4ea491b7ab3152f744ad8f8e \
|
||||
--hash=sha256:085b6ce182cdd6657481c7c4003a417e0655c4f6e58b76f26ee18f0ae21db827 \
|
||||
--hash=sha256:09b4beff9cc91d445d5818632907b85fb06943b61cb346919ce202668bf6794a \
|
||||
--hash=sha256:0a3ae28f0b209be5af50b54ca3e2123a3de3a57d87b75f1e5aa3d7961e041983 \
|
||||
--hash=sha256:0d2e8db5ade3736a163906154ca686203acc7d1d30736cbf577c730d13653d84 \
|
||||
--hash=sha256:0e00cec383eab2406c9e006bd4edb55d284e94bb943fda558326048178d26961 \
|
||||
--hash=sha256:14b2e1eb528d77bc0f4c5bd1a7ebc05e02b5b41beefb7e8567c9675b8b13bcf4 \
|
||||
--hash=sha256:15d416440148f3e56b9b244fdaf8a09fcf5a72e4944b8e119f5bf60417a2bfc8 \
|
||||
--hash=sha256:15e555c4caca42411270b2ed2b2ebc7b3a42bb04138cef6c956e1f1d49709fe2 \
|
||||
--hash=sha256:16b4fe9c97dc605f5e1887a9e1224287291e35c56cbc379f8aa44b6b7bcfe2bb \
|
||||
--hash=sha256:1b5c6ceb65fecd28a1d20d1eba9dbfa992612b86594e4b6d47bb580d2dd6bcb3 \
|
||||
--hash=sha256:2324d9a0502317ffc35d38e153c1b2fa9610ae03775c9d0f8d0cca7b8572b04e \
|
||||
--hash=sha256:2a248750abce1c76fbd11b2e1d88b95401e72819295c3b851ec73399d6849b3d \
|
||||
--hash=sha256:2ea6206043385343aff0b7da65cf73677f6f5e50de8f1c879e557f4298cac36a \
|
||||
--hash=sha256:31348a0ffbfc815ce78daac569d893349d85a0b57e1cd2cdbba50b7f333784da \
|
||||
--hash=sha256:3c2f947e55d3c7cfe124dd4521ee481516f3007d13c6ad4bf6aeb722e190eb1b \
|
||||
--hash=sha256:3ff4ede90ed771140caa7e1890de17431763a483c54b3c1f88bd30f0cc1affc0 \
|
||||
--hash=sha256:42d875388fbd091c7ea01edfff260f839ba303038ffb23475ef392012e4d63dd \
|
||||
--hash=sha256:50524f4f6a1c839714dbaff5386a1afb245d2d5ec8213a01fbc99cea7307811e \
|
||||
--hash=sha256:64df53eef4ac857eb5816a56e2885ccf0d7dff6333c94065c93b39c51063e01d \
|
||||
--hash=sha256:6879aed770557f0961b252648d36f6fdaab41079d37a2296b5649fd1b35608e0 \
|
||||
--hash=sha256:6ad57654570464eb1b040b5c353dee442608e06cff9102b8fcb105565a44c9ed \
|
||||
--hash=sha256:6c0aed6a4439994c9666fb8a5b6c4eac94d4ef6ddc95f9b806a599ef83547e3b \
|
||||
--hash=sha256:76bf3e7406cf23a3e1ca6a23fb1fb9ea82f4f6bd226fe226e09146b0194f85dc \
|
||||
--hash=sha256:7bbf05c38debc90d1a195b11340cc85cb43ab3e753dc47558a3a84a38cbc72da \
|
||||
--hash=sha256:7ddb08b3c2f9213df1f2e3eb2fbea4963d80ec0f8de21f0b59898e34f3b3d96d \
|
||||
--hash=sha256:7e07f6f644d2c44d53b7a320a084eef98063651912c1b9449b5f45fcbdc6ccd2 \
|
||||
--hash=sha256:85833bca01aa5cae326ac759276dc175c5fa3f7b3733b7d543cf27f2df12d1ef \
|
||||
--hash=sha256:8712b61eb1b74a4478cfd1c54f576056199e9f093659334aeb5c4a6b385338e5 \
|
||||
--hash=sha256:937794042342006f707837f38d721426b11b0774d327a2a45c0bd389eb750a87 \
|
||||
--hash=sha256:93bc91fdadcf046da37a214eaa714574e7e9b1913568e93bb09527b2ceb7f759 \
|
||||
--hash=sha256:94c5f1621cbcab83c03be46441f090b68b9f307b6c7ec44d4e3f6d5997383df4 \
|
||||
--hash=sha256:99cc80facad240b0c2fb5a633044420878aac87a8e7c348b9486450cba93f27c \
|
||||
--hash=sha256:9a5fcbe7b949f2e95c47ea8a80b410fcdf2da61c98553b45a4ee875580418b68 \
|
||||
--hash=sha256:a2d79c6635ccffcbfc1d5c045874ba36b594589be81d50d43472570bb8de9c57 \
|
||||
--hash=sha256:a7bf9cc97f05048ac8f3e02cd58f0fe62b901453c24345bfde287f4305dcc31c \
|
||||
--hash=sha256:bacbd3c69862478cbe1c7ed4325caedec580d8acf31b8ee1b9a1e02a56295cad \
|
||||
--hash=sha256:bb349dbba57c76eec25e5917e07f35aabaf0a33b9e67fc13d188002500106487 \
|
||||
--hash=sha256:bd03472c36fa3a386a6deb887113b9e3fa40efba8203eb4fe786d3c0ccc724f6 \
|
||||
--hash=sha256:bf85a00ac3b56a1e7a19c5be7b02b5180a0895ac4d3c234d717a55e86960691c \
|
||||
--hash=sha256:ca0c7ce828bb76ab78b3991904b477c2fd0f711d7815c252d1ef28ff9450b052 \
|
||||
--hash=sha256:ccbfd94e59aad4a2566c71912b55f0547ac1680bfac25eb138e6703eb3dd434e \
|
||||
--hash=sha256:d1831c07bd4dce53c4b666fa846c7eba4b7c414f2e641a4585b7f50b72f502dc \
|
||||
--hash=sha256:d22cad98c2a10bbf6aa083a8980db6ed90d4285a841c4de892890c2b28286ef9 \
|
||||
--hash=sha256:d30ad4359413c8821cc7b3707f7ca38aa8bc852ba3b9c5a759ee2d7740157315 \
|
||||
--hash=sha256:e0dd3676ea0837cd70ea1879765e9e9f6be063be0436de9b3ea4b775caf83654 \
|
||||
--hash=sha256:e6369ac293d2cc40d52577e4fa3d75a70c1aae2d01fa3580a34a4e6eff9286b9 \
|
||||
--hash=sha256:efae5df7a8cc8bdb1037b0f786b044ce281081441df5418c3a0f0e1f86fe7bb3 \
|
||||
--hash=sha256:f19b3af31d02a2e79c5f9a6deaab0fb3c116456aeb9277d11720ad433de6dfc6 \
|
||||
--hash=sha256:f7a0430d765f9bda043e6aefaba5944d5f21ec43ff4774417d7e296f61917382
|
||||
# via autobahn
|
||||
urllib3==2.6.3 \
|
||||
--hash=sha256:1b62b6884944a57dbe321509ab94fd4d3b307075e0c2eae991ac71ee15ad38ed \
|
||||
--hash=sha256:bf272323e553dfb2e87d9bfd225ca7b0f467b919d7bbd355436d3fd37cb0acd4
|
||||
# via
|
||||
# requests
|
||||
# sentry-sdk
|
||||
xlsxwriter==3.2.9 \
|
||||
--hash=sha256:254b1c37a368c444eac6e2f867405cc9e461b0ed97a3233b2ac1e574efb4140c \
|
||||
--hash=sha256:9a5db42bc5dff014806c58a20b9eae7322a134abb6fce3c92c181bfb275ec5b3
|
||||
# via onlinejudge
|
||||
zope-interface==8.4 \
|
||||
--hash=sha256:049ba3c7b38cc400ae08e011617635706e0f442e1d075db1b015246fcbf6091e \
|
||||
--hash=sha256:0d88c1f106a4f06e074a3ada2d20f4a602e3f2871c4f55726ed5d91e94ec19b1 \
|
||||
--hash=sha256:29f09ec8bda65f7b30294328070070a2590b90f252f834ee0817cdb0e2c35f6a \
|
||||
--hash=sha256:2bc388cebcb753d21eaf2a0481fd6f0ce6840a47300a40dcec0b56bac27d0f97 \
|
||||
--hash=sha256:2e9e4aa33b76877af903d5532545e64d24ade0f6f80d9d1a31e6efcea76a60bc \
|
||||
--hash=sha256:36c575356732d59ffd3279ad67e302a6fe517e67db5b061b36b377ee0fa016c4 \
|
||||
--hash=sha256:3e5866917ccb57d929e515a1136d729bd3fa4f367965fb16e38a4bc72cb05521 \
|
||||
--hash=sha256:4713bf651ec36e7eea49d2ace4f0e89bec2b33a339674874b1121f2537edc62a \
|
||||
--hash=sha256:4ae6a1e111642dbf724f635424dcaf5a5c8abbde49eac3f452f5323ffaa10232 \
|
||||
--hash=sha256:7cbb887fdbfaacb4c362dbb487033551646e28013ad5ffe72e96eb260003a1a1 \
|
||||
--hash=sha256:81ed23698bfb588c48b1756129814b890febac971ff6c8a414f82601773145bb \
|
||||
--hash=sha256:84064876ed96ddd0744e3ad5d37134c758d77885e54113567792671405a02bac \
|
||||
--hash=sha256:8b302f955c36e924e1f4fe70dd9105ff06235857861c6ae72c3b10b016aeee99 \
|
||||
--hash=sha256:9c4ac009c2c8e43283842f80387c4d4b41bcbc293391c3b9ab71532ae1ccc301 \
|
||||
--hash=sha256:9dbee7925a23aa6349738892c911019d4095a96cff487b743482073ecbc174a8 \
|
||||
--hash=sha256:a5638c6be715116d3453e6d099c299c6844d54810de7445ce116424e905ede06 \
|
||||
--hash=sha256:b8147b40bfcd53803870a9519e0879ff066aeecc2fcff8295663c1b17fc38dc2 \
|
||||
--hash=sha256:cd55965d715413038774aead54851bc3dbdd74a69f3ce30252182a94407b9905 \
|
||||
--hash=sha256:d934497c4b72d5f528d2b5ebe9b8b5a7004b5877948ebd4ea00c2432fb27178f \
|
||||
--hash=sha256:e0b9d7e958657fad414f8272afcdf0b8a873fbbb2bb6a6287232d2f11a232bf8 \
|
||||
--hash=sha256:eef0a49e041f4dc4d2a6ab894b4fd0c5354e0e8037e731fb953531e59b0d3d33 \
|
||||
--hash=sha256:f1f854bef8bc137519e4413bcc1322d55faad28b20b3ca39f7bec49d2f1b26df
|
||||
# via twisted
|
||||
asgiref==3.8.1
|
||||
certifi==2025.6.15
|
||||
charset-normalizer==3.4.2
|
||||
click==8.2.1
|
||||
django==5.2.3
|
||||
django-dbconn-retry==0.1.8
|
||||
django-dramatiq==0.13.0
|
||||
django-redis==5.4.0
|
||||
djangorestframework==3.16.0
|
||||
dramatiq==1.18.0
|
||||
envelopes==0.4
|
||||
gunicorn==23.0.0
|
||||
h11==0.16.0
|
||||
idna==3.10
|
||||
otpauth==2.2.1
|
||||
packaging==25.0
|
||||
pillow==11.2.1
|
||||
prometheus-client==0.22.1
|
||||
psycopg==3.2.9
|
||||
psycopg-binary==3.2.9
|
||||
python-dateutil==2.9.0.post0
|
||||
qrcode==8.2
|
||||
raven==6.10.0
|
||||
redis==6.2.0
|
||||
requests==2.32.4
|
||||
six==1.17.0
|
||||
sqlparse==0.5.3
|
||||
typing-extensions==4.14.0
|
||||
urllib3==2.4.0
|
||||
uvicorn==0.35.0
|
||||
xlsxwriter==3.2.5
|
||||
|
||||
@@ -28,7 +28,7 @@ stopwaitsecs = 5
killasgroup=true

[program:gunicorn]
command=gunicorn oj.wsgi --user server --group spj --bind 127.0.0.1:8080 --workers %(ENV_MAX_WORKER_NUM)s --threads 4 --max-requests-jitter 10000 --max-requests 1000000 --keep-alive 32
command=gunicorn oj.asgi --user server --group spj --bind 127.0.0.1:8080 --workers %(ENV_MAX_WORKER_NUM)s --threads 4 --max-requests-jitter 10000 --max-requests 1000000 --keep-alive 32 --worker-class uvicorn.workers.UvicornWorker
directory=/app/
stdout_logfile=/data/log/gunicorn.log
stderr_logfile=/data/log/gunicorn.log
@@ -38,18 +38,6 @@ startsecs=5
stopwaitsecs = 5
killasgroup=true

[program:daphne]
command=daphne -b 127.0.0.1 -p 8001 --access-log /data/log/daphne_access.log oj.asgi:application
directory=/app/
user=server
stdout_logfile=/data/log/daphne.log
stderr_logfile=/data/log/daphne.log
autostart=true
autorestart=true
startsecs=5
stopwaitsecs = 5
killasgroup=true

[program:dramatiq]
command=python3 manage.py rundramatiq --processes %(ENV_MAX_WORKER_NUM)s --threads 4
directory=/app/
173 dev.py
@@ -1,173 +0,0 @@
#!/usr/bin/env python
"""
WebSocket development server launcher.
Starts Daphne (WebSocket) and the Django runserver (dev server) together.
Supports Windows and Linux.
"""

import os
import platform
import signal
import subprocess
import sys
import time
from pathlib import Path
from threading import Thread


def main():
    # Resolve the project root directory
    base_dir = Path(__file__).resolve().parent
    os.chdir(base_dir)

    print("=" * 70)
    print("Starting Django dev server + WebSocket server")
    print("=" * 70)
    print()

    # Detect the operating system
    is_windows = platform.system() == "Windows"

    # Locate the virtualenv interpreter (cross-platform)
    if is_windows:
        # Windows: .venv/Scripts/python.exe
        venv_python = base_dir / ".venv" / "Scripts" / "python.exe"
    else:
        # Linux/Mac: .venv/bin/python
        venv_python = base_dir / ".venv" / "bin" / "python"

    if venv_python.exists():
        print("[✓] Using virtualenv: .venv")
        python_exec = str(venv_python)
    else:
        print("[!] .venv not found, falling back to the global Python")
        print("[!] Consider creating a virtualenv: python -m venv .venv")
        python_exec = sys.executable

    # Check that daphne is installed
    try:
        result = subprocess.run(
            [python_exec, "-m", "daphne", "--version"], capture_output=True, text=True
        )
        if result.returncode != 0 and result.returncode != 2:
            print("[✗] Error: Daphne is not installed")
            print("Run: pip install daphne channels channels-redis")
            sys.exit(1)
    except FileNotFoundError:
        print("[✗] Error: Python interpreter not found")
        sys.exit(1)

    # Child process list
    processes = []

    # Start both servers
    try:
        # Start Django runserver (port 8000)
        print("[*] Starting Django dev server (port 8000)...")
        runserver_cmd = ["uv", "run", "manage.py", "runserver", "0.0.0.0:8000"]
        runserver_process = subprocess.Popen(
            runserver_cmd,
            stdout=subprocess.PIPE,
            stderr=subprocess.STDOUT,
            text=True,
            bufsize=1,
            universal_newlines=True,
        )
        processes.append(("Django Runserver", runserver_process))

        # Give runserver a moment to come up first
        time.sleep(1)

        # Start Daphne (port 8001)
        print("[*] Starting Daphne WebSocket server (port 8001)...")
        daphne_cmd = [
            python_exec,
            "-m",
            "daphne",
            "-b",
            "0.0.0.0",
            "-p",
            "8001",
            "oj.asgi:application",
        ]
        daphne_process = subprocess.Popen(
            daphne_cmd,
            stdout=subprocess.PIPE,
            stderr=subprocess.STDOUT,
            text=True,
            bufsize=1,
            universal_newlines=True,
        )
        processes.append(("Daphne", daphne_process))

        print()
        print("[✓] All servers started")
        print()

        # Output-forwarding threads
        def print_output(name, process):
            """Forward a child process's output to stdout."""
            for line in process.stdout:
                print(f"[{name}] {line}", end="")

        threads = []
        for name, process in processes:
            thread = Thread(target=print_output, args=(name, process), daemon=True)
            thread.start()
            threads.append(thread)

        # Wait on the processes (exit as soon as either one exits)
        while True:
            for name, process in processes:
                if process.poll() is not None:
                    print(f"\n[!] {name} exited")
                    raise KeyboardInterrupt
            time.sleep(0.5)

    except KeyboardInterrupt:
        print()
        print()
        print("[*] Stopping all servers...")

        # Terminate every child process
        for name, process in processes:
            try:
                if process.poll() is None:  # still running
                    print(f"[*] Stopping {name}...")
                    if is_windows:
                        # Windows: send CTRL_C_EVENT
                        process.send_signal(signal.CTRL_C_EVENT)
                    else:
                        # Unix: send SIGTERM
                        process.terminate()

                    # Wait up to 5 seconds for it to exit
                    try:
                        process.wait(timeout=5)
                    except subprocess.TimeoutExpired:
                        print(f"[!] {name} did not respond, killing...")
                        process.kill()
                        process.wait()
            except Exception as e:
                print(f"[!] Error while stopping {name}: {e}")

        print()
        print("[✓] All servers stopped")

    except Exception as e:
        print(f"[✗] Error: {e}")

        # Clean up any remaining processes
        for name, process in processes:
            try:
                if process.poll() is None:
                    process.kill()
                    process.wait()
            except Exception:
                pass

        sys.exit(1)


if __name__ == "__main__":
    main()
@@ -1,7 +0,0 @@
from django.apps import AppConfig


class FlowchartConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'flowchart'
    verbose_name = '流程图管理'
@@ -1,84 +0,0 @@
"""
WebSocket consumers for flowchart evaluation updates
"""
import json
import logging

from channels.generic.websocket import AsyncWebsocketConsumer

logger = logging.getLogger(__name__)


class FlowchartConsumer(AsyncWebsocketConsumer):
    """
    WebSocket consumer for real-time flowchart evaluation updates.
    After a user submits a flowchart, AI grading status updates are
    pushed to them in real time over this WebSocket.
    """

    async def connect(self):
        """Handle a WebSocket connection."""
        self.user = self.scope["user"]

        # Only authenticated users may connect
        if not self.user.is_authenticated:
            await self.close()
            return

        # Use the user ID as the group name, so messages can be pushed
        # to one specific user
        self.group_name = f"flowchart_user_{self.user.id}"

        # Join the user's dedicated group
        await self.channel_layer.group_add(
            self.group_name,
            self.channel_name
        )

        await self.accept()
        logger.info(f"Flowchart WebSocket connected: user_id={self.user.id}, channel={self.channel_name}")

    async def disconnect(self, close_code):
        """Handle a WebSocket disconnect."""
        if hasattr(self, 'group_name'):
            await self.channel_layer.group_discard(
                self.group_name,
                self.channel_name
            )
        logger.info(f"Flowchart WebSocket disconnected: user_id={self.user.id}, close_code={close_code}")

    async def receive(self, text_data):
        """
        Receive a client message.
        Clients may send heartbeats or subscribe to a specific
        flowchart submission.
        """
        try:
            data = json.loads(text_data)
            message_type = data.get("type")

            if message_type == "ping":
                # Answer the heartbeat
                await self.send(text_data=json.dumps({
                    "type": "pong",
                    "timestamp": data.get("timestamp")
                }))
            elif message_type == "subscribe":
                # Subscribe to updates for a specific flowchart submission
                submission_id = data.get("submission_id")
                if submission_id:
                    logger.info(f"User {self.user.id} subscribed to flowchart submission {submission_id}")
                    # Additional subscription logic could go here
        except json.JSONDecodeError:
            logger.error(f"Invalid JSON received from user {self.user.id}")
        except Exception as e:
            logger.error(f"Error handling message from user {self.user.id}: {str(e)}")

    async def flowchart_evaluation_update(self, event):
        """
        Receive a flowchart evaluation update from the channel layer and
        forward it to the client. This method name corresponds to the
        "type" field set in push_flowchart_evaluation_update.
        """
        try:
            # Extract the payload from the event and send it to the client
            await self.send(text_data=json.dumps(event["data"]))
            logger.debug(f"Sent flowchart evaluation update to user {self.user.id}: {event['data']}")
        except Exception as e:
            logger.error(f"Error sending flowchart evaluation update to user {self.user.id}: {str(e)}")
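Channels routes a channel-layer event to the consumer method named by its `type` field, so an event typed `flowchart_evaluation_update` is delivered to `FlowchartConsumer.flowchart_evaluation_update`, which joins the group `flowchart_user_<id>` on connect. A minimal producer-side sketch of that contract follows; the helper names and the payload fields are illustrative assumptions, not taken from this diff:

```python
import json

def build_evaluation_event(submission_id, score, grade):
    # Hypothetical event shape: Channels dispatches on "type", turning
    # "flowchart_evaluation_update" into a call to
    # consumer.flowchart_evaluation_update(event).
    return {
        "type": "flowchart_evaluation_update",
        "data": {
            "submission_id": submission_id,
            "ai_score": score,
            "ai_grade": grade,
        },
    }

def group_name_for(user_id):
    # Must match the group joined in FlowchartConsumer.connect()
    return f"flowchart_user_{user_id}"

event = build_evaluation_event("abc123", 87.5, "B")
# The consumer would forward only event["data"], serialized as JSON:
payload = json.dumps(event["data"])
```

A grading task would then call `channel_layer.group_send(group_name_for(user.id), event)` (via `asgiref.sync.async_to_sync` from synchronous code) to push the update.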
@@ -1,45 +0,0 @@
# Generated by Django 5.2.3 on 2025-10-11 14:57

import django.db.models.deletion
import utils.shortcuts
from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        ('problem', '0004_problem_allow_flowchart_problem_flowchart_data_and_more'),
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name='FlowchartSubmission',
            fields=[
                ('id', models.TextField(db_index=True, default=utils.shortcuts.rand_str, primary_key=True, serialize=False)),
                ('mermaid_code', models.TextField()),
                ('flowchart_data', models.JSONField(default=dict)),
                ('status', models.IntegerField(default=0)),
                ('create_time', models.DateTimeField(auto_now_add=True)),
                ('ai_score', models.FloatField(blank=True, null=True)),
                ('ai_grade', models.CharField(blank=True, max_length=10, null=True)),
                ('ai_feedback', models.TextField(blank=True, null=True)),
                ('ai_suggestions', models.TextField(blank=True, null=True)),
                ('ai_criteria_details', models.JSONField(default=dict)),
                ('ai_provider', models.CharField(default='deepseek', max_length=50)),
                ('ai_model', models.CharField(default='deepseek-chat', max_length=50)),
                ('processing_time', models.FloatField(blank=True, null=True)),
                ('evaluation_time', models.DateTimeField(blank=True, null=True)),
                ('problem', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='flowchart_submissions', to='problem.problem')),
                ('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='flowchart_submissions', to=settings.AUTH_USER_MODEL)),
            ],
            options={
                'db_table': 'flowchart_submission',
                'ordering': ['-create_time'],
                'indexes': [models.Index(fields=['user', 'create_time'], name='flowchart_user_time_idx'), models.Index(fields=['problem', 'create_time'], name='flowchart_problem_time_idx'), models.Index(fields=['status'], name='flowchart_status_idx')],
            },
        ),
    ]
@@ -1,18 +0,0 @@
# Generated by Django 6.0 on 2026-04-27 12:31

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('flowchart', '0001_initial'),
    ]

    operations = [
        migrations.AlterField(
            model_name='flowchartsubmission',
            name='ai_model',
            field=models.CharField(default='deepseek-v4-flash', max_length=50),
        ),
    ]
@@ -1,66 +0,0 @@
from django.contrib.auth import get_user_model
from django.db import models

from problem.models import Problem
from utils.shortcuts import rand_str

User = get_user_model()


class FlowchartSubmissionStatus:
    PENDING = 0  # waiting for AI evaluation
    PROCESSING = 1  # AI evaluation in progress
    COMPLETED = 2  # evaluation finished
    FAILED = 3  # evaluation failed


class FlowchartSubmission(models.Model):
    """Flowchart submission model."""
    id = models.TextField(default=rand_str, primary_key=True, db_index=True)

    # Basic info
    user = models.ForeignKey(User, on_delete=models.CASCADE, related_name='flowchart_submissions')
    problem = models.ForeignKey(Problem, on_delete=models.CASCADE, related_name='flowchart_submissions')

    # Submitted content
    mermaid_code = models.TextField()  # Mermaid code
    flowchart_data = models.JSONField(default=dict)  # flowchart metadata

    # Status info
    status = models.IntegerField(default=FlowchartSubmissionStatus.PENDING)
    create_time = models.DateTimeField(auto_now_add=True)

    # AI evaluation result
    ai_score = models.FloatField(null=True, blank=True)  # AI score (0-100)
    ai_grade = models.CharField(max_length=10, null=True, blank=True)  # grade (S/A/B/C)
    ai_feedback = models.TextField(null=True, blank=True)  # AI feedback
    ai_suggestions = models.TextField(null=True, blank=True)  # AI suggestions
    ai_criteria_details = models.JSONField(default=dict)  # per-criterion details

    # Processing info
    ai_provider = models.CharField(max_length=50, default='deepseek')
    ai_model = models.CharField(max_length=50, default='deepseek-v4-flash')
    processing_time = models.FloatField(null=True, blank=True)  # AI processing time (seconds)
    evaluation_time = models.DateTimeField(null=True, blank=True)  # when evaluation finished

    class Meta:
        db_table = 'flowchart_submission'
        ordering = ['-create_time']
        indexes = [
            models.Index(fields=['user', 'create_time'], name='flowchart_user_time_idx'),
            models.Index(fields=['problem', 'create_time'], name='flowchart_problem_time_idx'),
            models.Index(fields=['status'], name='flowchart_status_idx'),
        ]

    def __str__(self):
        return f"FlowchartSubmission {self.id}"

    def check_user_permission(self, user, check_share=True):
        """Check the user's permission on this submission."""
        if (
            self.user_id == user.id
            or not user.is_regular_user()
            or self.problem.created_by_id == user.id
        ):
            return True

        return False
@@ -1,98 +0,0 @@
from rest_framework import serializers

from .models import FlowchartSubmission


class CreateFlowchartSubmissionSerializer(serializers.Serializer):
    problem_id = serializers.IntegerField()
    mermaid_code = serializers.CharField(max_length=50000)
    flowchart_data = serializers.JSONField(required=False, default=dict)

    def validate_mermaid_code(self, value):
        if not value.strip():
            raise serializers.ValidationError("Mermaid code cannot be empty")
        return value

    def validate_flowchart_data(self, value):
        import json
        if len(json.dumps(value)) > 500 * 1024:
            raise serializers.ValidationError("Flowchart data is too large")
        return value


class FlowchartSubmissionSerializer(serializers.ModelSerializer):
    class Meta:
        model = FlowchartSubmission
        fields = [
            "id",
            "user",
            "problem",
            "mermaid_code",
            "flowchart_data",
            "status",
            "create_time",
            "ai_score",
            "ai_grade",
            "ai_feedback",
            "ai_suggestions",
            "ai_criteria_details",
            "ai_provider",
            "ai_model",
            "processing_time",
            "evaluation_time",
        ]
        read_only_fields = ["id", "create_time", "evaluation_time"]


class FlowchartSubmissionListSerializer(serializers.ModelSerializer):
    """Slimmed-down serializer for list views."""

    username = serializers.CharField(source="user.username")
    problem = serializers.CharField(source="problem._id")
    problem_title = serializers.CharField(source="problem.title")

    class Meta:
        model = FlowchartSubmission
        fields = [
            "id",
            "username",
            "problem_title",
            "problem",
            "status",
            "create_time",
            "ai_score",
            "ai_grade",
            "ai_provider",
            "ai_model",
            "processing_time",
            "evaluation_time",
        ]


class FlowchartSubmissionSummarySerializer(serializers.ModelSerializer):
    """Minimal serializer for the AI detail page; only the necessary fields."""

    problem_title = serializers.CharField(source="problem.title")
    problem__id = serializers.CharField(source="problem._id")

    class Meta:
        model = FlowchartSubmission
        fields = [
            "id",
            "problem__id",
            "problem_title",
            "ai_score",
            "ai_grade",
            "create_time",
        ]


class FlowchartSubmissionMergedSerializer(serializers.Serializer):
    """Serializer for flowchart submissions merged per problem."""

    problem__id = serializers.CharField()
    problem_title = serializers.CharField()
    submission_count = serializers.IntegerField()
    best_score = serializers.FloatField()
    best_grade = serializers.CharField()
    latest_submission_time = serializers.DateTimeField()
    avg_score = serializers.FloatField()
@@ -1,171 +0,0 @@
import json
import time

import dramatiq
from django.db import transaction
from django.utils import timezone

from utils.openai import get_ai_client
from utils.shortcuts import DRAMATIQ_WORKER_ARGS

from .models import FlowchartSubmission, FlowchartSubmissionStatus


@dramatiq.actor(**DRAMATIQ_WORKER_ARGS(max_retries=3))
def evaluate_flowchart_task(submission_id):
    """Asynchronous AI evaluation task."""
    # Fetch outside the try block so the failure handler below always has a bound submission
    submission = FlowchartSubmission.objects.get(id=submission_id)
    try:
        # Mark as processing
        submission.status = FlowchartSubmissionStatus.PROCESSING
        submission.save()

        start_time = time.time()

        # Use the fixed grading criteria
        system_prompt = build_evaluation_prompt(submission.problem)

        # Build the user prompt, including a comparison with the reference answer
        user_prompt = f"""
Please grade the following Mermaid flowchart:

The student's submitted flowchart:
```mermaid
{submission.mermaid_code}
```
"""
        if submission.problem.mermaid_code:
            user_prompt += f"""
Reference answer:
```mermaid
{submission.problem.mermaid_code}
```
"""
        else:
            user_prompt += "\nNote: this problem has no reference answer; grade based on the problem description and the logical soundness of the flowchart.\n"

        if submission.problem.flowchart_hint:
            user_prompt += f"\nDesign hint: {submission.problem.flowchart_hint}\n"

        user_prompt += "\nPlease evaluate in detail against the grading criteria and give a score from 0 to 100.\n"

        # Call the AI for grading
        client = get_ai_client()

        response = client.chat.completions.create(
            model="deepseek-reasoner",
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": user_prompt}
            ],
            temperature=0.3,
        )

        ai_response = response.choices[0].message.content
        score_data = parse_ai_evaluation_response(ai_response)

        processing_time = time.time() - start_time

        # Save the evaluation result
        with transaction.atomic():
            submission.ai_score = score_data['score']
            submission.ai_grade = score_data['grade']
            submission.ai_feedback = score_data['feedback']
            submission.ai_suggestions = score_data.get('suggestions', '')
            submission.ai_criteria_details = score_data.get('criteria_details', {})
            submission.ai_provider = 'deepseek'
            submission.ai_model = 'deepseek-reasoner'
            submission.processing_time = processing_time
            submission.status = FlowchartSubmissionStatus.COMPLETED
            submission.evaluation_time = timezone.now()
            submission.save()

        # Push the evaluation-completed notification
        from utils.websocket import push_flowchart_evaluation_update
        push_flowchart_evaluation_update(
            submission_id=str(submission.id),
            user_id=submission.user_id,
            data={
                "type": "flowchart_evaluation_completed",
                "score": score_data['score'],
                "grade": score_data['grade'],
            }
        )

    except Exception as e:
        # Evaluation failed
        submission.status = FlowchartSubmissionStatus.FAILED
        submission.save()

        # Push the error notification
        from utils.websocket import push_flowchart_evaluation_update
        push_flowchart_evaluation_update(
            submission_id=str(submission.id),
            user_id=submission.user_id,
            data={
                "type": "flowchart_evaluation_failed",
                "submission_id": str(submission.id),
                "error": str(e)
            }
        )
        raise e


def build_evaluation_prompt(problem):
    """Build the AI grading prompt, using fixed criteria."""

    # Fixed grading criteria
    criteria_text = """
- Logical correctness (weight: 1.0, max: 40): check that the flowchart's logic is correct, including conditionals and loop structures
- Completeness (weight: 0.8, max: 30): check that the flowchart contains all necessary steps and branches
- Conformance (weight: 0.6, max: 20): check that flowchart symbols are used correctly and follow the standard
- Clarity (weight: 0.4, max: 10): assess the overall layout and connections (ignore whether node IDs are complex)
"""

    return f"""
You are a professional programming-teaching assistant responsible for evaluating Mermaid flowcharts submitted by students.

Grading criteria:
{criteria_text}

Grading requirements:
1. Carefully analyze the flowchart's logical correctness, completeness and clarity
2. Check whether all requirements of the problem are covered
3. Assess the flowchart's conformance and readability (ignore whether node IDs are complex)
4. Give a score from 0 to 100
5. Provide detailed feedback and improvement suggestions

Grade bands:
- S (90-100): excellent, clear logic, fully meets the requirements
- A (80-89): good, largely meets the requirements, minor room for improvement
- B (70-79): passing, basically correct but with some issues
- C (0-69): needs improvement, has obvious problems

Return the result as JSON:
{{
    "score": 85,
    "grade": "A",
    "feedback": "detailed feedback",
    "suggestions": "improvement suggestions",
    "criteria_details": {{
        "Logical correctness": {{"score": 35, "max": 40, "comment": "logic mostly correct"}},
        "Completeness": {{"score": 25, "max": 30, "comment": "some steps missing"}},
        "Conformance": {{"score": 18, "max": 20, "comment": "symbols used correctly"}},
        "Clarity": {{"score": 8, "max": 10, "comment": "clear layout"}}
    }}
}}
"""

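The prompt above instructs the model to use the bands S 90-100, A 80-89, B 70-79 and C 0-69 but leaves the score-to-grade mapping to the AI. A hypothetical helper (not present in the file) that applies the same bands locally would look like:

```python
def grade_for_score(score):
    """Map a 0-100 score to the S/A/B/C bands stated in the evaluation prompt."""
    if score >= 90:
        return "S"
    if score >= 80:
        return "A"
    if score >= 70:
        return "B"
    return "C"  # 0-69: needs improvement

print(grade_for_score(85))  # prints "A"
```

Such a helper could serve as a sanity check that the `grade` the model returns agrees with its `score`.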
def parse_ai_evaluation_response(ai_response):
    """Parse the AI evaluation response; parsing failures raise and are handled by the caller."""
    import re
    json_match = re.search(r'\{.*\}', ai_response, re.DOTALL)
    if not json_match:
        raise ValueError("No JSON found in the AI response")

    data = json.loads(json_match.group())

    if "score" not in data or "grade" not in data:
        raise ValueError("AI response is missing a required field: score or grade")

    return data
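The extraction logic above can be exercised standalone. This sketch inlines the same approach (a greedy `\{.*\}` match with `re.DOTALL`, then requiring `score` and `grade`); the sample response string is invented for illustration:

```python
import json
import re

def parse_ai_evaluation_response(ai_response):
    # Same extraction as the task module: grab the outermost {...} block, then require score/grade
    json_match = re.search(r'\{.*\}', ai_response, re.DOTALL)
    if not json_match:
        raise ValueError("No JSON found in the AI response")
    data = json.loads(json_match.group())
    if "score" not in data or "grade" not in data:
        raise ValueError("AI response is missing a required field: score or grade")
    return data

sample = 'Here is my evaluation:\n{"score": 85, "grade": "A", "feedback": "ok"}'
result = parse_ai_evaluation_response(sample)
```

Note the greedy `.*` means a response containing several `{...}` blocks is captured from the first `{` to the last `}`, so stray braces in surrounding prose can still break `json.loads` and trigger a retry.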
@@ -1 +0,0 @@
# URLs package
@@ -1,17 +0,0 @@
from django.urls import path

from ..views.oj import (
    FlowchartSubmissionAPI,
    FlowchartSubmissionCurrentAPI,
    FlowchartSubmissionDetailAPI,
    FlowchartSubmissionListAPI,
    FlowchartSubmissionRetryAPI,
)

urlpatterns = [
    path('flowchart/submission', FlowchartSubmissionAPI.as_view()),
    path('flowchart/submissions', FlowchartSubmissionListAPI.as_view()),
    path('flowchart/submission/retry', FlowchartSubmissionRetryAPI.as_view()),
    path('flowchart/submission/detail', FlowchartSubmissionDetailAPI.as_view()),
    path('flowchart/submission/current', FlowchartSubmissionCurrentAPI.as_view()),
]
@@ -1,2 +0,0 @@

# Create your views here.
@@ -1 +0,0 @@
# Views package
@@ -1,204 +0,0 @@
from account.decorators import login_required
from flowchart.models import FlowchartSubmission, FlowchartSubmissionStatus
from flowchart.serializers import (
    CreateFlowchartSubmissionSerializer,
    FlowchartSubmissionListSerializer,
    FlowchartSubmissionSerializer,
)
from flowchart.tasks import evaluate_flowchart_task
from problem.models import Problem
from utils.api import APIView


class FlowchartSubmissionAPI(APIView):
    @login_required
    def post(self, request):
        """Create a flowchart submission."""
        serializer = CreateFlowchartSubmissionSerializer(data=request.data)
        if not serializer.is_valid():
            return self.error(serializer.errors)

        data = serializer.validated_data

        # Check that the problem exists
        try:
            problem = Problem.objects.get(id=data["problem_id"])
        except Problem.DoesNotExist:
            return self.error("Problem doesn't exist")

        # Check that the problem allows flowchart submissions
        if not problem.allow_flowchart:
            return self.error("This problem does not allow flowchart submission")

        # Create the submission record
        submission = FlowchartSubmission.objects.create(
            user=request.user,
            problem=problem,
            mermaid_code=data["mermaid_code"],
            flowchart_data=data.get("flowchart_data", {}),
        )

        # Kick off the AI evaluation task
        evaluate_flowchart_task.send(submission.id)

        return self.success({"submission_id": submission.id, "status": "pending"})

    @login_required
    def get(self, request):
        """Get the details of a flowchart submission."""
        submission_id = request.GET.get("id")
        if not submission_id:
            return self.error("submission_id is required")

        try:
            submission = FlowchartSubmission.objects.get(id=submission_id)
        except FlowchartSubmission.DoesNotExist:
            return self.error("Submission doesn't exist")

        if not submission.check_user_permission(request.user):
            return self.error("No permission for this submission")

        serializer = FlowchartSubmissionSerializer(submission)
        return self.success(serializer.data)


class FlowchartSubmissionListAPI(APIView):
    @login_required
    def get(self, request):
        """List flowchart submissions."""
        username = request.GET.get("username")
        problem_id = request.GET.get("problem_id")
        myself = request.GET.get("myself")

        queryset = FlowchartSubmission.objects.select_related("user", "problem")

        if problem_id:
            try:
                problem = Problem.objects.get(
                    _id=problem_id, contest_id__isnull=True, visible=True
                )
            except Problem.DoesNotExist:
                return self.error("Problem doesn't exist")
            queryset = queryset.filter(problem=problem)
        if myself and myself == "1":
            queryset = queryset.filter(user=request.user)
        if username:
            queryset = queryset.filter(user__username__icontains=username)

        data = self.paginate_data(request, queryset)
        data["results"] = FlowchartSubmissionListSerializer(
            data["results"], many=True
        ).data
        return self.success(data)


class FlowchartSubmissionRetryAPI(APIView):
    @login_required
    def post(self, request):
        """Re-trigger the AI evaluation."""
        submission_id = request.data.get("submission_id")
        if not submission_id:
            return self.error("submission_id is required")

        try:
            submission = FlowchartSubmission.objects.get(id=submission_id)
        except FlowchartSubmission.DoesNotExist:
            return self.error("Submission doesn't exist")

        # Check permission
        if not submission.check_user_permission(request.user):
            return self.error("No permission for this submission")

        # Check that the submission can be re-evaluated
        if submission.status not in [
            FlowchartSubmissionStatus.FAILED,
            FlowchartSubmissionStatus.COMPLETED,
        ]:
            return self.error("Submission is not in a state that allows retry")

        # Reset state and restart the AI evaluation
        submission.status = FlowchartSubmissionStatus.PENDING
        submission.ai_score = None
        submission.ai_grade = None
        submission.ai_feedback = None
        submission.ai_suggestions = None
        submission.ai_criteria_details = {}
        submission.processing_time = None
        submission.evaluation_time = None
        submission.save()

        # Restart the AI evaluation task
        evaluate_flowchart_task.send(submission.id)

        return self.success(
            {
                "submission_id": submission.id,
                "status": "pending",
                "message": "AI evaluation restarted",
            }
        )


class FlowchartSubmissionDetailAPI(APIView):
    @login_required
    def get(self, request):
        """Get the current user's flowchart submission details for a given problem."""
        problem_id = request.GET.get("problem_id")
        if not problem_id:
            return self.error("problem_id is required")
        try:
            problem = Problem.objects.get(id=problem_id)
        except Problem.DoesNotExist:
            return self.error("Problem doesn't exist")

        page = int(request.GET.get("page", 0))
        submissions = FlowchartSubmission.objects.filter(
            user=request.user,
            problem=problem,
            status=FlowchartSubmissionStatus.COMPLETED,
        ).order_by("create_time")
        count = submissions.count()
        if count == 0:
            return self.success({"submission": None, "count": 0})
        # page=0 means latest; page=N means the Nth submission (1-indexed, chronological)
        if page == 0:
            submission = submissions.last()
        else:
            if page < 0 or page > count:
                return self.error("Page out of range")
            submission = submissions[page - 1]
        serializer = FlowchartSubmissionSerializer(submission)
        return self.success({"submission": serializer.data, "count": count})


class FlowchartSubmissionCurrentAPI(APIView):
    @login_required
    def get(self, request):
        """Get the current user's latest flowchart submission for a problem; returns only the count and score."""
        problem_id = request.GET.get("problem_id")
        if not problem_id:
            return self.error("problem_id is required")
        try:
            problem = Problem.objects.get(id=problem_id)
        except Problem.DoesNotExist:
            return self.error("Problem doesn't exist")
        submissions = (
            FlowchartSubmission.objects.filter(
                user=request.user,
                problem=problem,
                status=FlowchartSubmissionStatus.COMPLETED,
            )
            .values("ai_score", "ai_grade")
            .order_by("-create_time")
        )
        count = submissions.count()
        if count == 0:
            return self.success({"count": 0, "score": 0, "grade": ""})
        submission = submissions[0]
        return self.success(
            {
                "count": count,
                "score": submission["ai_score"],
                "grade": submission["ai_grade"],
            }
        )
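The `page` parameter of `FlowchartSubmissionDetailAPI` has slightly unusual semantics: 0 selects the latest submission, while N selects the Nth submission in chronological (oldest-first) order, 1-indexed. A sketch over a plain list (a hypothetical stand-in for the `order_by("create_time")` queryset):

```python
def pick_submission(submissions, page):
    """submissions is ordered oldest-first, mirroring .order_by("create_time")."""
    count = len(submissions)
    if count == 0:
        return None
    if page == 0:
        return submissions[-1]      # latest, like queryset.last()
    if page < 0 or page > count:
        raise ValueError("Page out of range")
    return submissions[page - 1]    # 1-indexed chronological position

history = ["first", "second", "third"]
```

So `page=count` and `page=0` both resolve to the most recent completed submission.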
@@ -1,11 +1,11 @@
#!/usr/bin/env python3
import base64
import copy
import random
import string
import hashlib
import json
import os
import random
import string
import xml.etree.ElementTree as ET

@@ -4,19 +4,18 @@ import logging
from urllib.parse import urljoin

import requests
from django.db import IntegrityError, transaction
from django.db import transaction, IntegrityError
from django.db.models import F

from account.models import User
from conf.models import JudgeServer
from contest.models import ACMContestRank, ContestRuleType, ContestStatus, OIContestRank
from contest.models import ContestRuleType, ACMContestRank, OIContestRank, ContestStatus
from options.options import SysOptions
from problem.models import Problem, ProblemRuleType
from problem.utils import parse_problem_template
from submission.models import JudgeStatus, Submission
from utils.cache import cache
from utils.constants import CacheKey
from utils.websocket import push_submission_update

logger = logging.getLogger(__name__)

@@ -67,6 +66,26 @@ class DispatcherBase(object):
            logger.exception(e)


class SPJCompiler(DispatcherBase):
    def __init__(self, spj_code, spj_version, spj_language):
        super().__init__()
        spj_compile_config = list(filter(lambda config: spj_language == config["name"], SysOptions.spj_languages))[0]["spj"]["compile"]
        self.data = {
            "src": spj_code,
            "spj_version": spj_version,
            "spj_compile_config": spj_compile_config
        }

    def compile_spj(self):
        with ChooseJudgeServer() as server:
            if not server:
                return "No available judge_server"
            result = self._request(urljoin(server.service_url, "compile_spj"), data=self.data)
            if not result:
                return "Failed to call judge server"
            if result["err"]:
                return result["data"]


class JudgeDispatcher(DispatcherBase):
@@ -106,6 +125,12 @@ class JudgeDispatcher(DispatcherBase):
    def judge(self):
        language = self.submission.language
        sub_config = list(filter(lambda item: language == item["name"], SysOptions.languages))[0]
        spj_config = {}
        if self.problem.spj_code:
            for lang in SysOptions.spj_languages:
                if lang["name"] == self.problem.spj_language:
                    spj_config = lang["spj"]
                    break

        if language in self.problem.template:
            template = parse_problem_template(self.problem.template[language])
@@ -120,6 +145,10 @@ class JudgeDispatcher(DispatcherBase):
            "max_memory": 1024 * 1024 * self.problem.memory_limit,
            "test_case_id": self.problem.test_case_id,
            "output": False,
            "spj_version": self.problem.spj_version,
            "spj_config": spj_config.get("config"),
            "spj_compile_config": spj_config.get("compile"),
            "spj_src": self.problem.spj_code,
            "io_mode": self.problem.io_mode
        }
@@ -127,56 +156,12 @@ class JudgeDispatcher(DispatcherBase):
        if not server:
            data = {"submission_id": self.submission.id, "problem_id": self.problem.id}
            cache.lpush(CacheKey.waiting_queue, json.dumps(data))
            # Push the "queued" status
            try:
                push_submission_update(
                    submission_id=str(self.submission.id),
                    user_id=self.submission.user_id,
                    data={
                        "type": "submission_update",
                        "submission_id": str(self.submission.id),
                        "result": JudgeStatus.PENDING,
                        "status": "pending",
                    }
                )
            except Exception as e:
                logger.error(f"Failed to push submission update: {str(e)}")
            return
        Submission.objects.filter(id=self.submission.id).update(result=JudgeStatus.JUDGING)

        # Push the "judging" status
        try:
            push_submission_update(
                submission_id=str(self.submission.id),
                user_id=self.submission.user_id,
                data={
                    "type": "submission_update",
                    "submission_id": str(self.submission.id),
                    "result": JudgeStatus.JUDGING,
                    "status": "judging",
                }
            )
        except Exception as e:
            logger.error(f"Failed to push submission update: {str(e)}")

        resp = self._request(urljoin(server.service_url, "/judge"), data=data)

        if not resp:
            Submission.objects.filter(id=self.submission.id).update(result=JudgeStatus.SYSTEM_ERROR)
            # Push the "system error" status
            try:
                push_submission_update(
                    submission_id=str(self.submission.id),
                    user_id=self.submission.user_id,
                    data={
                        "type": "submission_update",
                        "submission_id": str(self.submission.id),
                        "result": JudgeStatus.SYSTEM_ERROR,
                        "status": "error",
                    }
                )
            except Exception as e:
                logger.error(f"Failed to push submission update: {str(e)}")
            return

        if resp["err"]:
@@ -197,24 +182,6 @@ class JudgeDispatcher(DispatcherBase):
        else:
            self.submission.result = JudgeStatus.PARTIALLY_ACCEPTED
        self.submission.save()

        # Push the "finished" status
        try:
            push_submission_update(
                submission_id=str(self.submission.id),
                user_id=self.submission.user_id,
                data={
                    "type": "submission_update",
                    "submission_id": str(self.submission.id),
                    "result": self.submission.result,
                    "status": "finished",
                    "time_cost": self.submission.statistic_info.get("time_cost"),
                    "memory_cost": self.submission.statistic_info.get("memory_cost"),
                    "score": self.submission.statistic_info.get("score", 0),
                }
            )
        except Exception as e:
            logger.error(f"Failed to push submission update: {str(e)}")

        if self.contest_id:
            if self.contest.status != ContestStatus.CONTEST_UNDERWAY or \
@@ -1,5 +1,6 @@
from problem.models import ProblemIOMode


default_env = ["LANG=en_US.UTF-8", "LANGUAGE=en_US:en", "LC_ALL=en_US.UTF-8"]

_c_lang_config = {
@@ -34,6 +35,20 @@ int main() {
}
}

_c_lang_spj_compile = {
    "src_name": "spj-{spj_version}.c",
    "exe_name": "spj-{spj_version}",
    "max_cpu_time": 3000,
    "max_real_time": 10000,
    "max_memory": 1024 * 1024 * 1024,
    "compile_command": "/usr/bin/gcc -DONLINE_JUDGE -O2 -w -fmax-errors=3 -std=c17 {src_path} -lm -o {exe_path}"
}

_c_lang_spj_config = {
    "exe_name": "spj-{spj_version}",
    "command": "{exe_path} {in_file_path} {user_out_file_path}",
    "seccomp_rule": "c_cpp"
}

_cpp_lang_config = {
    "template": """//PREPEND BEGIN
@@ -67,6 +82,20 @@ int main() {
}
}

_cpp_lang_spj_compile = {
    "src_name": "spj-{spj_version}.cpp",
    "exe_name": "spj-{spj_version}",
    "max_cpu_time": 10000,
    "max_real_time": 20000,
    "max_memory": 1024 * 1024 * 1024,
    "compile_command": "/usr/bin/g++ -DONLINE_JUDGE -O2 -w -fmax-errors=3 -std=c++20 {src_path} -lm -o {exe_path}"
}

_cpp_lang_spj_config = {
    "exe_name": "spj-{spj_version}",
    "command": "{exe_path} {in_file_path} {user_out_file_path}",
    "seccomp_rule": "c_cpp"
}

_java_lang_config = {
    "template": """//PREPEND BEGIN
@@ -195,8 +224,10 @@ console.log(add(1, 2))
}

languages = [
    {"config": _c_lang_config, "name": "C", "description": "GCC 13", "content_type": "text/x-csrc"},
    {"config": _cpp_lang_config, "name": "C++", "description": "GCC 13", "content_type": "text/x-c++src"},
    {"config": _c_lang_config, "name": "C", "description": "GCC 13", "content_type": "text/x-csrc",
     "spj": {"compile": _c_lang_spj_compile, "config": _c_lang_spj_config}},
    {"config": _cpp_lang_config, "name": "C++", "description": "GCC 13", "content_type": "text/x-c++src",
     "spj": {"compile": _cpp_lang_spj_compile, "config": _cpp_lang_spj_config}},
    {"config": _java_lang_config, "name": "Java", "description": "Temurin 21", "content_type": "text/x-java"},
    {"config": _py3_lang_config, "name": "Python3", "description": "Python 3.12", "content_type": "text/x-python"},
    {"config": _go_lang_config, "name": "Golang", "description": "Golang 1.22", "content_type": "text/x-go"},
@@ -1,8 +1,8 @@
import dramatiq

from account.models import User
from judge.dispatcher import JudgeDispatcher
from submission.models import Submission
from judge.dispatcher import JudgeDispatcher
from utils.shortcuts import DRAMATIQ_WORKER_ARGS

@@ -5,8 +5,8 @@ import sys
if __name__ == "__main__":
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "oj.settings")

    import django
    from django.core.management import execute_from_command_line
    import django
    sys.stdout.write("Django VERSION " + str(django.VERSION) + "\n")

    execute_from_command_line(sys.argv)

@@ -1,20 +0,0 @@
# Generated by Django 6.0 on 2026-04-23 20:07

from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('message', '0001_initial'),
        ('submission', '0004_submission_problem_user_idx'),
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.AddIndex(
            model_name='message',
            index=models.Index(fields=['recipient', 'create_time'], name='message_recipient_time_idx'),
        ),
    ]
@@ -1,5 +1,4 @@
from django.db import models

from account.models import User
from submission.models import Submission
from utils.models import RichTextField
@@ -17,6 +16,3 @@ class Message(models.Model):
    class Meta:
        db_table = "message"
        ordering = ("-create_time",)
        indexes = [
            models.Index(fields=["recipient", "create_time"], name="message_recipient_time_idx"),
        ]

@@ -1,9 +1,9 @@
from submission.serializers import SubmissionSafeModelSerializer
from utils.api import UsernameSerializer, serializers

from .models import Message


class MessageSerializer(serializers.ModelSerializer):
    sender = UsernameSerializer()
    submission = SubmissionSafeModelSerializer()

@@ -1,9 +1,11 @@
from account.decorators import login_required, super_admin_required
from account.decorators import super_admin_required, login_required
from account.models import User
from message.models import Message
from message.serializers import CreateMessageSerializer, MessageSerializer
from submission.models import Submission
from utils.api import APIView

from message.models import Message

from utils.api.api import validate_serializer

@@ -11,7 +13,7 @@ class MessageAPI(APIView):
|
||||
@login_required
|
||||
def get(self, request):
|
||||
messages = Message.objects.select_related(
|
||||
"recipient", "sender", "submission", "submission__problem"
|
||||
"recipient", "sender", "submission"
|
||||
).filter(recipient=request.user)
|
||||
return self.success(self.paginate_data(request, messages, MessageSerializer))
|
||||
|
||||
|
||||
38 oj/asgi.py
@@ -1,31 +1,7 @@
-"""
-ASGI config for oj project.
-
-It exposes the ASGI callable as a module-level variable named ``application``.
-
-For more information on this file, see
-https://docs.djangoproject.com/en/5.2/howto/deployment/asgi/
-"""
-
-import os
-
-from channels.auth import AuthMiddlewareStack
-from channels.routing import ProtocolTypeRouter, URLRouter
-from django.core.asgi import get_asgi_application
-
-os.environ.setdefault("DJANGO_SETTINGS_MODULE", "oj.settings")
-
-# Initialize Django ASGI application early to ensure the AppRegistry
-# is populated before importing code that may import ORM models.
-django_asgi_app = get_asgi_application()
-
-# Import routing after Django setup
-from oj.routing import websocket_urlpatterns  # noqa: E402
-
-application = ProtocolTypeRouter(
-    {
-        "http": django_asgi_app,
-        "websocket": AuthMiddlewareStack(URLRouter(websocket_urlpatterns)),
-    }
-)
+import os
+
+from django.core.asgi import get_asgi_application
+
+os.environ.setdefault("DJANGO_SETTINGS_MODULE", "oj.settings")
+
+application = get_asgi_application()
@@ -1,22 +1,19 @@
# coding=utf-8
import os
+from utils.shortcuts import get_env

BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

DATABASES = {
    "default": {
-        "ENGINE": "django.db.backends.postgresql",
-        "HOST": "150.158.29.156",
-        "PORT": "5455",
-        "NAME": "onlinejudge",
-        "USER": "onlinejudge",
-        "PASSWORD": "onlinejudge",
+        "ENGINE": "django.db.backends.sqlite3",
+        "NAME": os.path.join(BASE_DIR, "db.sqlite3"),
    }
}

REDIS_CONF = {
-    "host": "150.158.29.156",
-    "port": 5456,
+    "host": get_env("REDIS_HOST", "127.0.0.1"),
+    "port": get_env("REDIS_PORT", "6380"),
}
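In the REDIS_CONF hunk above, hardcoded connection details give way to `get_env` lookups. The helper itself is not part of this diff; a minimal sketch, assuming it simply wraps `os.environ.get`, would be:

```python
import os

def get_env(key, default=""):
    # Hypothetical helper: read a setting from the environment,
    # falling back to a default when the variable is unset.
    return os.environ.get(key, default)

# Mirrors the new REDIS_CONF block, defaulting to a local Redis.
REDIS_CONF = {
    "host": get_env("REDIS_HOST", "127.0.0.1"),
    "port": get_env("REDIS_PORT", "6380"),
}
```

Note the value stays a string either way, so callers that need a numeric port must cast it.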
@@ -1,16 +0,0 @@
-"""
-WebSocket URL Configuration for oj project.
-"""
-
-from django.urls import path
-
-from conf.consumers import ConfigConsumer
-from flowchart.consumers import FlowchartConsumer
-from submission.consumers import SubmissionConsumer
-
-websocket_urlpatterns = [
-    path("ws/submission/", SubmissionConsumer.as_asgi()),
-    path("ws/config/", ConfigConsumer.as_asgi()),
-    path("ws/flowchart/", FlowchartConsumer.as_asgi()),
-]
@@ -10,14 +10,9 @@ For the full list of settings and their values, see
https://docs.djangoproject.com/en/1.8/ref/settings/
"""

-import logging
import os
+import raven
from copy import deepcopy

-import sentry_sdk
-from sentry_sdk.integrations.django import DjangoIntegration
-from sentry_sdk.integrations.logging import LoggingIntegration
-
from utils.shortcuts import get_env

production_env = get_env("OJ_ENV", "dev") == "production"
@@ -33,18 +28,19 @@ BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

# Applications
VENDOR_APPS = [
-    "daphne",  # Channels ASGI server - must be first
    "django.contrib.auth",
    "django.contrib.sessions",
    "django.contrib.contenttypes",
    "django.contrib.messages",
    "django.contrib.staticfiles",
    "rest_framework",
-    "channels",
    "django_dramatiq",
    "django_dbconn_retry",
]

+if production_env:
+    VENDOR_APPS.append("raven.contrib.django.raven_compat")

LOCAL_APPS = [
    "account",
@@ -59,10 +55,6 @@ LOCAL_APPS = [
    "message",
    "comment",
    "tutorial",
-    "ai",
-    "flowchart",
-    "problemset",
-    "class_pk",
]

INSTALLED_APPS = VENDOR_APPS + LOCAL_APPS
@@ -99,9 +91,6 @@ TEMPLATES = [
]
WSGI_APPLICATION = "oj.wsgi.application"

-# ASGI Application for WebSocket support
-ASGI_APPLICATION = "oj.asgi.application"
-
# Password validation
# https://docs.djangoproject.com/en/1.9/ref/settings/#auth-password-validators

@@ -123,9 +112,13 @@ AUTH_PASSWORD_VALIDATORS = [
# Internationalization
# https://docs.djangoproject.com/en/1.8/topics/i18n/

-LANGUAGE_CODE = "zh-cn"
+LANGUAGE_CODE = "en-us"

-TIME_ZONE = "Asia/Shanghai"
+TIME_ZONE = "UTC"

USE_I18N = True

+USE_L10N = True
+
USE_TZ = True

@@ -150,19 +143,7 @@ HITOKOTO_DIR = os.path.join(DATA_DIR, "hitokoto")
STATICFILES_DIRS = [os.path.join(DATA_DIR, "public")]

-SENTRY_DSN = get_env("SENTRY_DSN")
-if production_env and SENTRY_DSN:
-    sentry_sdk.init(
-        dsn=SENTRY_DSN,
-        integrations=[
-            DjangoIntegration(),
-            LoggingIntegration(level=logging.INFO, event_level=logging.ERROR),
-        ],
-        send_default_pii=False,
-    )
-
-LOGGING_HANDLERS = ["console"]
+LOGGING_HANDLERS = ["console", "sentry"] if production_env else ["console"]
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
@@ -178,6 +159,11 @@
            "class": "logging.StreamHandler",
            "formatter": "standard",
        },
+        "sentry": {
+            "level": "ERROR",
+            "class": "raven.contrib.django.raven_compat.handlers.SentryHandler",
+            "formatter": "standard",
+        },
    },
    "loggers": {
        "django.request": {
@@ -224,23 +210,12 @@ def redis_config(db):
    }

-CACHES = {"default": redis_config(db=1)}
+if production_env:
+    CACHES = {"default": redis_config(db=1)}

SESSION_ENGINE = "django.contrib.sessions.backends.cache"
SESSION_CACHE_ALIAS = "default"

-# Channels Configuration
-CHANNEL_LAYERS = {
-    "default": {
-        "BACKEND": "channels_redis.core.RedisChannelLayer",
-        "CONFIG": {
-            "hosts": [(REDIS_CONF["host"], REDIS_CONF["port"])],
-            "capacity": 1500,  # maximum queued messages per channel
-            "expiry": 10,  # message expiry time (seconds)
-        },
-    },
-}

DRAMATIQ_BROKER = {
    "BROKER": "dramatiq.brokers.redis.RedisBroker",
    "OPTIONS": {
@@ -265,6 +240,10 @@ DRAMATIQ_RESULT_BACKEND = {
    "MIDDLEWARE_OPTIONS": {"result_ttl": None},
}

+RAVEN_CONFIG = {
+    "dsn": "https://b200023b8aed4d708fb593c5e0a6ad3d:1fddaba168f84fcf97e0d549faaeaff0@sentry.io/263057"
+}
+
IP_HEADER = "HTTP_X_REAL_IP"

DEFAULT_AUTO_FIELD = "django.db.models.AutoField"
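The CACHES hunk calls a `redis_config(db)` helper defined elsewhere in settings.py and not shown in this diff. A plausible reconstruction, where the `django_redis` backend string and the URL shape are assumptions rather than facts from the diff, is:

```python
def redis_config(db):
    # Hypothetical reconstruction: build a django-redis cache definition
    # pointing at the Redis host/port for the given database number.
    host, port = "127.0.0.1", 6380  # stand-ins for REDIS_CONF values
    return {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://%s:%s/%d" % (host, port, db),
    }

CACHES = {"default": redis_config(db=1)}
```

Keeping the db number as a parameter lets the cache and the session store share one Redis instance on separate databases.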
@@ -19,9 +19,4 @@ urlpatterns = [
    path("api/admin/", include("comment.urls.admin")),
    path("api/", include("tutorial.urls.tutorial")),
    path("api/admin/", include("tutorial.urls.admin")),
-    path("api/", include("ai.urls.oj")),
-    path("api/", include("flowchart.urls.oj")),
-    path("api/", include("problemset.urls.oj")),
-    path("api/admin/", include("problemset.urls.admin")),
-    path("api/", include("class_pk.urls.oj")),
]
@@ -1,5 +1,4 @@
from django.db import models

from utils.models import JSONField
@@ -3,11 +3,10 @@ import os
import threading
import time

-from django.db import IntegrityError, transaction
+from django.db import transaction, IntegrityError

+from judge.languages import languages
from utils.shortcuts import rand_str

-from judge.languages import languages
from .models import SysOptions as SysOptionsModel
@@ -105,7 +104,6 @@ class OptionKeys:
    judge_server_token = "judge_server_token"
    throttling = "throttling"
    languages = "languages"
-    enable_maxkb = "enable_maxkb"


class OptionDefaultValue:
@@ -121,7 +119,6 @@ class OptionDefaultValue:
    throttling = {"ip": {"capacity": 100, "fill_rate": 0.1, "default_capacity": 50},
                  "user": {"capacity": 20, "fill_rate": 0.03, "default_capacity": 10}}
    languages = languages
-    enable_maxkb = True


class _SysOptionsMeta(type):
@@ -218,7 +215,7 @@ class _SysOptionsMeta(type):
    def website_footer(cls, value):
        cls._set_option(OptionKeys.website_footer, value)

-    @my_property(ttl=DEFAULT_SHORT_TTL)
+    @my_property
    def allow_register(cls):
        return cls._get_option(OptionKeys.allow_register)

@@ -250,7 +247,7 @@ class _SysOptionsMeta(type):
    def smtp_config(cls, value):
        cls._set_option(OptionKeys.smtp_config, value)

-    @my_property(ttl=DEFAULT_SHORT_TTL)
+    @my_property
    def judge_server_token(cls):
        return cls._get_option(OptionKeys.judge_server_token)

@@ -258,7 +255,7 @@ class _SysOptionsMeta(type):
    def judge_server_token(cls, value):
        cls._set_option(OptionKeys.judge_server_token, value)

-    @my_property(ttl=DEFAULT_SHORT_TTL)
+    @my_property
    def throttling(cls):
        return cls._get_option(OptionKeys.throttling)

@@ -274,18 +271,17 @@ class _SysOptionsMeta(type):
    def languages(cls, value):
        cls._set_option(OptionKeys.languages, value)

    @my_property(ttl=DEFAULT_SHORT_TTL)
    def spj_languages(cls):
        return [item for item in cls.languages if "spj" in item]

    @my_property(ttl=DEFAULT_SHORT_TTL)
    def language_names(cls):
        return [item["name"] for item in cls.languages]

    @my_property(ttl=DEFAULT_SHORT_TTL)
    def enable_maxkb(cls):
        return cls._get_option(OptionKeys.enable_maxkb)

    @enable_maxkb.setter
    def enable_maxkb(cls, value):
        cls._set_option(OptionKeys.enable_maxkb, value)

    def spj_language_names(cls):
        return [item["name"] for item in cls.languages if "spj" in item]

    def reset_languages(cls):
        cls.languages = languages
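Several hunks above swap `@my_property(ttl=DEFAULT_SHORT_TTL)` for a bare `@my_property`. Neither variant's implementation appears in this diff; a rough sketch of how a class-level cached property with an optional TTL could work (the names and caching policy here are assumptions, not the project's actual code) is:

```python
import time

class my_property:
    # Hypothetical sketch of a class-level cached property that accepts
    # an optional time-to-live; usable bare or as my_property(ttl=...).
    def __init__(self, func=None, ttl=None):
        self.func = func
        self.ttl = ttl
        self._cache = {}  # target class -> (value, timestamp)

    def __call__(self, func):
        # Reached when used as @my_property(ttl=...): receive the function.
        self.func = func
        return self

    def __get__(self, obj, objtype=None):
        target = obj if obj is not None else objtype
        hit = self._cache.get(target)
        if hit is not None:
            value, stamp = hit
            if self.ttl is None or time.monotonic() - stamp < self.ttl:
                return value  # still fresh (or cached forever when ttl is None)
        value = self.func(target)
        self._cache[target] = (value, time.monotonic())
        return value
```

Used on a metaclass such as `_SysOptionsMeta`, attribute access on the class hands the class itself to `__get__` as `obj`, so the wrapped function receives `cls` exactly as in the diff.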
1 options/tests.py Normal file
@@ -0,0 +1 @@
+# Create your tests here.
Some files were not shown because too many files have changed in this diff.