Compare commits

..

47 Commits

Author SHA1 Message Date
5e7d801075 Merge pull request 'feature/api_endpoint' (#5) from feature/api_endpoint into main
All checks were successful
Backend CI / unit-test (push) Successful in 43s
Reviewed-on: #5
2025-10-01 15:55:47 +02:00
94fb4705ff add tests for router and openapi, still need to add routes for update interest
All checks were successful
Backend CI / unit-test (push) Successful in 1m10s
Backend CI / unit-test (pull_request) Successful in 44s
2025-10-01 15:53:48 +02:00
bb87b90285 service layer add all tests for existing code
All checks were successful
Backend CI / unit-test (push) Successful in 40s
2025-09-29 16:48:28 +02:00
5eae75b23e wip service test
Some checks failed
Backend CI / unit-test (push) Failing after 37s
2025-09-26 22:37:26 +02:00
6a5f160d83 add interest accural test, improve migration tests
All checks were successful
Backend CI / unit-test (push) Successful in 37s
2025-09-25 22:16:24 +02:00
27b4adaca4 add interest change tables 2025-09-25 12:08:07 +02:00
e66aab99ea basic api is there
All checks were successful
Backend CI / unit-test (push) Successful in 35s
2025-09-24 21:02:21 +02:00
80fc405bf6 Almost finish basic functionalities
All checks were successful
Backend CI / unit-test (push) Successful in 36s
2025-09-24 17:33:27 +02:00
cf6c826468 use utils module 2025-09-24 10:44:32 +02:00
a6592bd140 wip 2025-09-23 23:35:15 +02:00
92c4e0d4fc refine type checking
All checks were successful
Backend CI / unit-test (push) Successful in 35s
2025-09-23 17:37:14 +02:00
544f5e8c92 Merge pull request 'add readme' (#4) from feature/readme into main
All checks were successful
Backend CI / unit-test (push) Successful in 34s
Reviewed-on: #4
2025-09-23 10:44:11 +02:00
b6ba108156 add readme
All checks were successful
Backend CI / unit-test (push) Successful in 33s
Backend CI / unit-test (pull_request) Successful in 33s
2025-09-23 10:43:32 +02:00
b68249f9f1 add create get exchange endpoint
All checks were successful
Backend CI / unit-test (push) Successful in 34s
2025-09-22 23:07:28 +02:00
1750401278 several changes:
All checks were successful
Backend CI / unit-test (push) Successful in 34s
* api calls for auth

* exchange now bind to user
2025-09-22 22:51:59 +02:00
466e6ce653 wip user reg
All checks were successful
Backend CI / unit-test (push) Successful in 34s
2025-09-22 17:35:10 +02:00
e70a63e4f9 add security py 2025-09-22 14:54:29 +02:00
76ed38e9af add crud for exchange 2025-09-22 14:39:33 +02:00
1fbc93353d add exchange table
All checks were successful
Backend CI / unit-test (push) Successful in 35s
2025-09-22 14:33:32 +02:00
76cc967c42 cycle and trade add exchange field 2025-09-19 23:04:17 +02:00
442da655c0 Fix linting error and linting config 2025-09-19 15:30:41 +02:00
07d33c4568 Merge pull request 'feature/db_add_session' (#3) from feature/db_add_session into main
All checks were successful
Backend CI / unit-test (push) Successful in 30s
Reviewed-on: #3
2025-09-19 14:21:11 +02:00
9f3010d300 fix path typo
All checks were successful
Backend CI / unit-test (push) Successful in 30s
Backend CI / unit-test (pull_request) Successful in 30s
2025-09-19 14:18:21 +02:00
bc264c8014 use workspace
Some checks failed
Backend CI / unit-test (push) Failing after 26s
2025-09-19 14:16:25 +02:00
afd342b31f cwd for compare model
Some checks failed
Backend CI / unit-test (push) Failing after 26s
2025-09-19 14:14:51 +02:00
39fc10572e add ci to compare models
Some checks failed
Backend CI / unit-test (push) Failing after 25s
2025-09-19 14:13:05 +02:00
2fbf1e9e01 Add session db 2025-09-19 14:06:32 +02:00
0bc85c1faf Merge pull request 'feature/db' (#2) from feature/db into main
All checks were successful
Backend CI / unit-test (push) Successful in 27s
Reviewed-on: #2
2025-09-18 14:28:17 +02:00
eb1f8c0e37 db ferst version is done.
All checks were successful
Backend CI / unit-test (push) Successful in 28s
Backend CI / unit-test (pull_request) Successful in 27s
2025-09-18 14:26:55 +02:00
d1f1b3e66c Add invalidate not yet with tests
All checks were successful
Backend CI / unit-test (push) Successful in 27s
2025-09-17 16:36:56 +02:00
a0898fa29e refine modle
All checks were successful
Backend CI / unit-test (push) Successful in 27s
2025-09-15 20:30:32 +02:00
7041cc654e refine teardown
All checks were successful
Backend CI / unit-test (push) Successful in 25s
2025-09-14 21:09:26 +02:00
2c22f20b48 continue on crud
Some checks failed
Backend CI / unit-test (push) Failing after 27s
2025-09-14 21:01:12 +02:00
5753ad3767 fix test teardown
All checks were successful
Backend CI / unit-test (push) Successful in 24s
2025-09-14 17:17:48 +02:00
1d215c8032 wip crud
Some checks failed
Backend CI / unit-test (push) Failing after 24s
2025-09-14 17:03:39 +02:00
479d5cd230 add user table
All checks were successful
Backend CI / unit-test (push) Successful in 48s
2025-09-14 15:40:11 +02:00
0a906535dc Revert "for better speed use slim image"
All checks were successful
Backend CI / unit-test (push) Successful in 26s
This reverts commit 1e2bfbeedb.
2025-09-13 21:33:54 +02:00
1e2bfbeedb for better speed use slim image
Some checks failed
Backend CI / unit-test (push) Has been cancelled
2025-09-13 21:29:48 +02:00
616232b76d add migration and enable ci
All checks were successful
Backend CI / unit-test (push) Successful in 1m40s
2025-09-13 21:14:14 +02:00
738df559cb add db models 2025-09-13 18:46:16 +02:00
64a2726c73 wip 2025-09-13 12:58:46 +02:00
6f6170cca4 remove almebic now 2025-09-12 23:24:46 +02:00
b56a506ede another wip 2025-09-12 21:19:36 +00:00
8e30875351 add model wip 2025-09-12 15:05:01 +00:00
aa7b2f403f Merge pull request 'feature/initial' (#1) from feature/initial into main
Reviewed-on: #1
2025-09-12 16:07:21 +02:00
db591679c9 start 2025-09-12 14:06:58 +00:00
10b3927348 Use requirements in for better production 2025-09-12 11:46:57 +00:00
36 changed files with 6869 additions and 72 deletions

125
.github/script/compare_models.py vendored Normal file

@@ -0,0 +1,125 @@
import ast
import json
import sys
from pathlib import Path

# Ensure the "backend" package directory is on sys.path so `import trading_journal` works.
# Find the repo root by walking upwards until we find a "backend" directory.
p = Path(__file__).resolve()
repo_root = None
while True:
    if (p / "backend").exists():
        repo_root = p
        break
    if p.parent == p:
        break
    p = p.parent

# fallback: two levels up (covers common .github/script layout)
if repo_root is None:
    repo_root = Path(__file__).resolve().parents[2]

backend_dir = repo_root / "backend"
if backend_dir.exists():
    sys.path.insert(0, str(backend_dir))


def load_struct(path: Path):
    src = path.read_text(encoding="utf-8")
    mod = ast.parse(src)
    out = {}
    for node in mod.body:
        if not isinstance(node, ast.ClassDef):
            continue
        # detect SQLModel table classes:
        is_table = any(
            (
                kw.arg == "table"
                and isinstance(kw.value, ast.Constant)
                and kw.value.value is True
            )
            for kw in getattr(node, "keywords", [])
        ) or any(
            getattr(b, "id", None) == "SQLModel"
            or getattr(b, "attr", None) == "SQLModel"
            for b in getattr(node, "bases", [])
        )
        if not is_table:
            continue
        fields = []
        for item in node.body:
            # annotated assignment: name: type = value
            if isinstance(item, ast.AnnAssign) and getattr(item.target, "id", None):
                name = item.target.id
                ann = (
                    ast.unparse(item.annotation)
                    if item.annotation is not None
                    else None
                )
                val = ast.unparse(item.value) if item.value is not None else None
                fields.append((name, ann, val))
            # simple assign: name = value (rare for Field, but include)
            elif isinstance(item, ast.Assign):
                for t in item.targets:
                    if getattr(t, "id", None):
                        name = t.id
                        ann = None
                        val = (
                            ast.unparse(item.value) if item.value is not None else None
                        )
                        fields.append((name, ann, val))
        # sort fields by name for deterministic comparison
        fields.sort(key=lambda x: x[0])
        out[node.name] = fields
    return out


def main():
    if len(sys.argv) == 1:
        print(
            "usage: compare_models.py <live_model_path> [snapshot_model_path]",
            file=sys.stderr,
        )
        sys.exit(2)
    live = Path(sys.argv[1])
    snap = None
    if len(sys.argv) >= 3:
        snap = Path(sys.argv[2])
    else:
        # auto-detect snapshot via db_migration.LATEST_VERSION
        try:
            import importlib

            dbm = importlib.import_module("trading_journal.db_migration")
            latest = getattr(dbm, "LATEST_VERSION")
            snap = Path(live.parent) / f"models_v{latest}.py"
        except Exception as e:
            print("failed to determine snapshot path:", e, file=sys.stderr)
            sys.exit(2)
    if not live.exists() or not snap.exists():
        print(
            f"file missing: live={live.exists()} snap={snap.exists()}", file=sys.stderr
        )
        sys.exit(2)
    a = load_struct(live)
    b = load_struct(snap)
    if a != b:
        print("models mismatch\n")
        diff = {
            "live_only_classes": sorted(set(a) - set(b)),
            "snapshot_only_classes": sorted(set(b) - set(a)),
            "mismatched_classes": {},
        }
        for cls in set(a) & set(b):
            if a[cls] != b[cls]:
                diff["mismatched_classes"][cls] = {"live": a[cls], "snapshot": b[cls]}
        print(json.dumps(diff, indent=2, ensure_ascii=False))
        sys.exit(1)
    print("models match snapshot")
    sys.exit(0)


if __name__ == "__main__":
    main()

34
.github/workflows/backend-ci.yml vendored Normal file

@@ -0,0 +1,34 @@
name: Backend CI

on:
  push:
  pull_request:
  workflow_dispatch:

jobs:
  unit-test:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: backend
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'
      - name: Install deps
        run: pip install -r dev-requirements.txt
      - name: Run models vs snapshot check
        working-directory: ${{ github.workspace }}
        run: |
          python .github/script/compare_models.py backend/trading_journal/models.py
      - name: Run tests
        run: |
          pytest -q

19
LICENSE Normal file

@@ -0,0 +1,19 @@
Copyright (c) 2025 Tianyu Liu, Studio TJ
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE
OR OTHER DEALINGS IN THE SOFTWARE.

28
README.md Normal file

@@ -0,0 +1,28 @@
# Trading Journal (Work In Progress)
A simple trading journal application (work in progress).
This repository contains the backend of a trading journal that helps you record and analyse trades. It is built around journaling trades for the "wheel" options strategy, but it also supports other trade types such as long/short spot positions, forex, and more.
Important: the project is still under active development. There is a backend in this repo, but the frontend UI has not been implemented yet.
## Key features
- Journal trades with rich metadata (strategy, entry/exit, P/L, notes).
- Built-in support and data model conveniences for the Wheel strategy (puts/calls lifecycle tracking).
- Flexible support for other trade types: long/short spots, forex, futures, etc.
- Backend-first design with tests and migration helpers.
## Repository layout
- `backend/` — Python backend code (API, models, services, migrations, tests).
- `backend/trading_journal/` — core application modules: CRUD, models, DTOs, services, and security.
- `backend/tests/` — unit tests targeting the backend logic and DB layer.
## License
See the `LICENSE` file in the project root for license details.

6
backend/.gitignore vendored

@@ -11,3 +11,9 @@ venv.bak/
 __pycache__/
 .pytest_cache/
 *.db
+*.db-shm
+*.db-wal
+devsettings.yaml

View File

@@ -10,12 +10,17 @@
"request": "launch",
"module": "uvicorn",
"args": [
"main:app",
"app:app",
"--host=0.0.0.0",
"--reload",
"--port=5000"
"--port=18881"
],
"jinja": true,
"autoStartBrowser": true
"autoStartBrowser": false,
"env": {
"CONFIG_FILE": "devsettings.yaml"
},
"console": "integratedTerminal"
}
]
}

View File

@@ -11,5 +11,6 @@
"tests"
],
"python.testing.unittestEnabled": false,
"python.testing.pytestEnabled": true
"python.testing.pytestEnabled": true,
"python.analysis.typeCheckingMode": "standard",
}

View File

@@ -1,33 +1,324 @@
-from fastapi import FastAPI
+from __future__ import annotations
-from models import MsgPayload
+
+import asyncio
+import logging
+from contextlib import asynccontextmanager
+from datetime import datetime, timezone
+from typing import TYPE_CHECKING
-app = FastAPI()
-messages_list: dict[int, MsgPayload] = {}
+
+from fastapi import FastAPI, HTTPException, Request, status
+from fastapi.encoders import jsonable_encoder
+from fastapi.responses import JSONResponse, Response
+
+import settings
+from trading_journal import db, service
+from trading_journal.dto import (
+    CycleBase,
+    CycleRead,
+    CycleUpdate,
+    ExchangesBase,
+    ExchangesRead,
+    SessionsBase,
+    SessionsCreate,
+    TradeCreate,
+    TradeFriendlyNameUpdate,
+    TradeNoteUpdate,
+    TradeRead,
+    UserCreate,
+    UserLogin,
+    UserRead,
+)
+
+if TYPE_CHECKING:
+    from collections.abc import AsyncGenerator
+
+    from trading_journal.db import Database
+
+_db = db.create_database(settings.settings.database_url)
+
+logging.basicConfig(
+    level=logging.WARNING,
+    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
+)
+logger = logging.getLogger(__name__)
-@app.get("/")
-def root() -> dict[str, str]:
-    return {"message": "Hello"}
+
+@asynccontextmanager
+async def lifespan(app: FastAPI) -> AsyncGenerator[None, None]:  # noqa: ARG001
+    await asyncio.to_thread(_db.init_db)
+    try:
+        yield
+    finally:
+        await asyncio.to_thread(_db.dispose)
-# About page route
-@app.get("/about")
-def about() -> dict[str, str]:
-    return {"message": "This is the about page."}
+
+origins = [
+    "http://127.0.0.1:18881",
+]
+
+app = FastAPI(lifespan=lifespan)
+app.add_middleware(
+    service.AuthMiddleWare,
+)
+app.state.db_factory = _db
-# Route to add a message
-@app.post("/messages/{msg_name}/")
-def add_msg(msg_name: str) -> dict[str, MsgPayload]:
-    # Generate an ID for the item based on the highest ID in the messages_list
-    msg_id = max(messages_list.keys()) + 1 if messages_list else 0
-    messages_list[msg_id] = MsgPayload(msg_id=msg_id, msg_name=msg_name)
-    return {"message": messages_list[msg_id]}
+
+@app.get(f"{settings.settings.api_base}/status")
+async def get_status() -> dict[str, str]:
+    return {"status": "ok"}
-# Route to list all messages
-@app.get("/messages")
-def message_items() -> dict[str, dict[int, MsgPayload]]:
-    return {"messages:": messages_list}
+
+@app.post(f"{settings.settings.api_base}/register")
+async def register_user(request: Request, user_in: UserCreate) -> Response:
+    db_factory: Database = request.app.state.db_factory
+
+    def sync_work() -> UserRead:
+        with db_factory.get_session_ctx_manager() as db:
+            return service.register_user_service(db, user_in)
+
+    try:
+        user = await asyncio.to_thread(sync_work)
+        return JSONResponse(status_code=status.HTTP_201_CREATED, content=user.model_dump())
+    except service.UserAlreadyExistsError as e:
+        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e)) from e
+    except Exception as e:
+        logger.exception("Failed to register user: \n")
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Internal Server Error") from e
+
+@app.post(f"{settings.settings.api_base}/login")
+async def login(request: Request, user_in: UserLogin) -> Response:
+    db_factory: Database = request.app.state.db_factory
+
+    def sync_work() -> tuple[SessionsCreate, str] | None:
+        with db_factory.get_session_ctx_manager() as db:
+            return service.authenticate_user_service(db, user_in)
+
+    try:
+        result = await asyncio.to_thread(sync_work)
+        if result is None:
+            return JSONResponse(
+                status_code=status.HTTP_401_UNAUTHORIZED,
+                content={"detail": "Invalid username or password, or user doesn't exist"},
+            )
+        session, token = result
+        session_return = SessionsBase(user_id=session.user_id)
+        response = JSONResponse(status_code=status.HTTP_200_OK, content=session_return.model_dump())
+        expires_sec = int((session.expires_at.replace(tzinfo=timezone.utc) - datetime.now(timezone.utc)).total_seconds())
+        response.set_cookie(
+            key="session_token",
+            value=token,
+            httponly=True,
+            secure=True,
+            samesite="lax",
+            max_age=expires_sec,
+            path="/",
+        )
+    except Exception as e:
+        logger.exception("Failed to login user: \n")
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Internal Server Error") from e
+    else:
+        return response
+
+# Exchange
+@app.post(f"{settings.settings.api_base}/exchanges")
+async def create_exchange(request: Request, exchange_data: ExchangesBase) -> Response:
+    db_factory: Database = request.app.state.db_factory
+
+    def sync_work() -> ExchangesBase:
+        with db_factory.get_session_ctx_manager() as db:
+            return service.create_exchange_service(db, request.state.user_id, exchange_data.name, exchange_data.notes)
+
+    try:
+        exchange = await asyncio.to_thread(sync_work)
+        return JSONResponse(status_code=status.HTTP_201_CREATED, content=exchange.model_dump())
+    except service.ExchangeAlreadyExistsError as e:
+        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e)) from e
+    except Exception as e:
+        logger.exception("Failed to create exchange: \n")
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Internal server error") from e
+
+@app.get(f"{settings.settings.api_base}/exchanges")
+async def get_exchanges(request: Request) -> list[ExchangesRead]:
+    db_factory: Database = request.app.state.db_factory
+
+    def sync_work() -> list[ExchangesRead]:
+        with db_factory.get_session_ctx_manager() as db:
+            return service.get_exchanges_by_user_service(db, request.state.user_id)
+
+    try:
+        return await asyncio.to_thread(sync_work)
+    except Exception as e:
+        logger.exception("Failed to get exchanges: \n")
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Internal server error") from e
+
+@app.patch(f"{settings.settings.api_base}/exchanges/{{exchange_id}}")
+async def update_exchange(request: Request, exchange_id: int, exchange_data: ExchangesBase) -> Response:
+    db_factory: Database = request.app.state.db_factory
+
+    def sync_work() -> ExchangesBase:
+        with db_factory.get_session_ctx_manager() as db:
+            return service.update_exchanges_service(db, request.state.user_id, exchange_id, exchange_data.name, exchange_data.notes)
+
+    try:
+        exchange = await asyncio.to_thread(sync_work)
+        return JSONResponse(status_code=status.HTTP_200_OK, content=exchange.model_dump())
+    except service.ExchangeNotFoundError as e:
+        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(e)) from e
+    except service.ExchangeAlreadyExistsError as e:
+        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e)) from e
+    except Exception as e:
+        logger.exception("Failed to update exchange: \n")
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Internal server error") from e
+
+# Cycle
+@app.post(f"{settings.settings.api_base}/cycles")
+async def create_cycle(request: Request, cycle_data: CycleBase) -> Response:
+    return JSONResponse(status_code=status.HTTP_405_METHOD_NOT_ALLOWED, content="Not supported.")
+    db_factory: Database = request.app.state.db_factory
+
+    def sync_work() -> CycleBase:
+        with db_factory.get_session_ctx_manager() as db:
+            return service.create_cycle_service(db, request.state.user_id, cycle_data)
+
+    try:
+        cycle = await asyncio.to_thread(sync_work)
+        return JSONResponse(status_code=status.HTTP_201_CREATED, content=jsonable_encoder(cycle))
+    except Exception as e:
+        logger.exception("Failed to create cycle: \n")
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Internal server error") from e
+
+@app.get(f"{settings.settings.api_base}/cycles/{{cycle_id}}")
+async def get_cycle_by_id(request: Request, cycle_id: int) -> Response:
+    db_factory: Database = request.app.state.db_factory
+
+    def sync_work() -> CycleBase:
+        with db_factory.get_session_ctx_manager() as db:
+            return service.get_cycle_by_id_service(db, request.state.user_id, cycle_id)
+
+    try:
+        cycle = await asyncio.to_thread(sync_work)
+        return JSONResponse(status_code=status.HTTP_200_OK, content=jsonable_encoder(cycle))
+    except service.CycleNotFoundError as e:
+        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(e)) from e
+    except Exception as e:
+        logger.exception("Failed to get cycle by id: \n")
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Internal server error") from e
+
+@app.get(f"{settings.settings.api_base}/cycles/user/{{user_id}}")
+async def get_cycles_by_user(request: Request, user_id: int) -> Response:
+    db_factory: Database = request.app.state.db_factory
+
+    def sync_work() -> list[CycleRead]:
+        with db_factory.get_session_ctx_manager() as db:
+            return service.get_cycles_by_user_service(db, user_id)
+
+    try:
+        cycles = await asyncio.to_thread(sync_work)
+        return JSONResponse(status_code=status.HTTP_200_OK, content=jsonable_encoder(cycles))
+    except Exception as e:
+        logger.exception("Failed to get cycles by user: \n")
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Internal server error") from e
+
+@app.patch(f"{settings.settings.api_base}/cycles")
+async def update_cycle(request: Request, cycle_data: CycleUpdate) -> Response:
+    db_factory: Database = request.app.state.db_factory
+
+    def sync_work() -> CycleRead:
+        with db_factory.get_session_ctx_manager() as db:
+            return service.update_cycle_service(db, request.state.user_id, cycle_data)
+
+    try:
+        cycle = await asyncio.to_thread(sync_work)
+        return JSONResponse(status_code=status.HTTP_200_OK, content=jsonable_encoder(cycle))
+    except service.InvalidCycleDataError as e:
+        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e)) from e
+    except service.CycleNotFoundError as e:
+        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(e)) from e
+    except Exception as e:
+        logger.exception("Failed to update cycle: \n")
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Internal server error") from e
+
+@app.post(f"{settings.settings.api_base}/trades")
+async def create_trade(request: Request, trade_data: TradeCreate) -> Response:
+    db_factory: Database = request.app.state.db_factory
+
+    def sync_work() -> TradeRead:
+        with db_factory.get_session_ctx_manager() as db:
+            return service.create_trade_service(db, request.state.user_id, trade_data)
+
+    try:
+        trade = await asyncio.to_thread(sync_work)
+        return JSONResponse(status_code=status.HTTP_201_CREATED, content=jsonable_encoder(trade))
+    except service.InvalidTradeDataError as e:
+        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e)) from e
+    except Exception as e:
+        logger.exception("Failed to create trade: \n")
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Internal server error") from e
+
+@app.get(f"{settings.settings.api_base}/trades/{{trade_id}}")
+async def get_trade_by_id(request: Request, trade_id: int) -> Response:
+    db_factory: Database = request.app.state.db_factory
+
+    def sync_work() -> TradeRead:
+        with db_factory.get_session_ctx_manager() as db:
+            return service.get_trade_by_id_service(db, request.state.user_id, trade_id)
+
+    try:
+        trade = await asyncio.to_thread(sync_work)
+        return JSONResponse(status_code=status.HTTP_200_OK, content=jsonable_encoder(trade))
+    except service.TradeNotFoundError as e:
+        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(e)) from e
+    except Exception as e:
+        logger.exception("Failed to get trade by id: \n")
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Internal server error") from e
+
+@app.patch(f"{settings.settings.api_base}/trades/friendlyname")
+async def update_trade_friendly_name(request: Request, friendly_name_update: TradeFriendlyNameUpdate) -> Response:
+    db_factory: Database = request.app.state.db_factory
+
+    def sync_work() -> TradeRead:
+        with db_factory.get_session_ctx_manager() as db:
+            return service.update_trade_friendly_name_service(
+                db,
+                request.state.user_id,
+                friendly_name_update.id,
+                friendly_name_update.friendly_name,
+            )
+
+    try:
+        trade = await asyncio.to_thread(sync_work)
+        return JSONResponse(status_code=status.HTTP_200_OK, content=jsonable_encoder(trade))
+    except service.TradeNotFoundError as e:
+        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(e)) from e
+    except Exception as e:
+        logger.exception("Failed to update trade friendly name: \n")
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Internal server error") from e
+
+@app.patch(f"{settings.settings.api_base}/trades/notes")
+async def update_trade_note(request: Request, note_update: TradeNoteUpdate) -> Response:
+    db_factory: Database = request.app.state.db_factory
+
+    def sync_work() -> TradeRead:
+        with db_factory.get_session_ctx_manager() as db:
+            return service.update_trade_note_service(db, request.state.user_id, note_update.id, note_update.notes)
+
+    try:
+        trade = await asyncio.to_thread(sync_work)
+        return JSONResponse(status_code=status.HTTP_200_OK, content=jsonable_encoder(trade))
+    except service.TradeNotFoundError as e:
+        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=str(e)) from e
+    except Exception as e:
+        logger.exception("Failed to update trade note: \n")
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Internal server error") from e
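The login handler above derives the session cookie's `max_age` from the session expiry, first attaching UTC to a naive timestamp. A standalone sketch of that arithmetic (the helper name and the one-hour expiry are illustrative, not part of the app's session model):

```python
from datetime import datetime, timedelta, timezone

def cookie_max_age(expires_at: datetime) -> int:
    """Seconds until expiry; a naive datetime is assumed to already be in UTC."""
    if expires_at.tzinfo is None:
        # Mirrors the handler's .replace(tzinfo=timezone.utc): no clock shift,
        # the naive value is simply labeled as UTC.
        expires_at = expires_at.replace(tzinfo=timezone.utc)
    return int((expires_at - datetime.now(timezone.utc)).total_seconds())

expiry = datetime.now(timezone.utc) + timedelta(hours=1)
print(cookie_max_age(expiry))  # roughly 3600
```

Note that `.replace(tzinfo=...)` only relabels the timestamp; it is correct here only because the stored expiry is known to be UTC, whereas a local-time value would need `astimezone` instead.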

View File

@@ -0,0 +1,2 @@
-r requirements.in
pytest

View File

@@ -1,2 +1,511 @@
-r requirements.txt
pytest
#
# This file is autogenerated by pip-compile with Python 3.11
# by the following command:
#
# pip-compile --generate-hashes dev-requirements.in
#
annotated-types==0.7.0 \
--hash=sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53 \
--hash=sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89
# via pydantic
anyio==4.10.0 \
--hash=sha256:3f3fae35c96039744587aa5b8371e7e8e603c0702999535961dd336026973ba6 \
--hash=sha256:60e474ac86736bbfd6f210f7a61218939c318f43f9972497381f1c5e930ed3d1
# via
# httpx
# starlette
argon2-cffi==25.1.0 \
--hash=sha256:694ae5cc8a42f4c4e2bf2ca0e64e51e23a040c6a517a85074683d3959e1346c1 \
--hash=sha256:fdc8b074db390fccb6eb4a3604ae7231f219aa669a2652e0f20e16ba513d5741
# via -r requirements.in
argon2-cffi-bindings==25.1.0 \
--hash=sha256:1db89609c06afa1a214a69a462ea741cf735b29a57530478c06eb81dd403de99 \
--hash=sha256:1e021e87faa76ae0d413b619fe2b65ab9a037f24c60a1e6cc43457ae20de6dc6 \
--hash=sha256:21378b40e1b8d1655dd5310c84a40fc19a9aa5e6366e835ceb8576bf0fea716d \
--hash=sha256:2630b6240b495dfab90aebe159ff784d08ea999aa4b0d17efa734055a07d2f44 \
--hash=sha256:3c6702abc36bf3ccba3f802b799505def420a1b7039862014a65db3205967f5a \
--hash=sha256:3d3f05610594151994ca9ccb3c771115bdb4daef161976a266f0dd8aa9996b8f \
--hash=sha256:473bcb5f82924b1becbb637b63303ec8d10e84c8d241119419897a26116515d2 \
--hash=sha256:5acb4e41090d53f17ca1110c3427f0a130f944b896fc8c83973219c97f57b690 \
--hash=sha256:5d588dec224e2a83edbdc785a5e6f3c6cd736f46bfd4b441bbb5aa1f5085e584 \
--hash=sha256:6dca33a9859abf613e22733131fc9194091c1fa7cb3e131c143056b4856aa47e \
--hash=sha256:7aef0c91e2c0fbca6fc68e7555aa60ef7008a739cbe045541e438373bc54d2b0 \
--hash=sha256:84a461d4d84ae1295871329b346a97f68eade8c53b6ed9a7ca2d7467f3c8ff6f \
--hash=sha256:87c33a52407e4c41f3b70a9c2d3f6056d88b10dad7695be708c5021673f55623 \
--hash=sha256:8b8efee945193e667a396cbc7b4fb7d357297d6234d30a489905d96caabde56b \
--hash=sha256:a1c70058c6ab1e352304ac7e3b52554daadacd8d453c1752e547c76e9c99ac44 \
--hash=sha256:a98cd7d17e9f7ce244c0803cad3c23a7d379c301ba618a5fa76a67d116618b98 \
--hash=sha256:aecba1723ae35330a008418a91ea6cfcedf6d31e5fbaa056a166462ff066d500 \
--hash=sha256:b0fdbcf513833809c882823f98dc2f931cf659d9a1429616ac3adebb49f5db94 \
--hash=sha256:b55aec3565b65f56455eebc9b9f34130440404f27fe21c3b375bf1ea4d8fbae6 \
--hash=sha256:b957f3e6ea4d55d820e40ff76f450952807013d361a65d7f28acc0acbf29229d \
--hash=sha256:ba92837e4a9aa6a508c8d2d7883ed5a8f6c308c89a4790e1e447a220deb79a85 \
--hash=sha256:c4f9665de60b1b0e99bcd6be4f17d90339698ce954cfd8d9cf4f91c995165a92 \
--hash=sha256:c87b72589133f0346a1cb8d5ecca4b933e3c9b64656c9d175270a000e73b288d \
--hash=sha256:d3e924cfc503018a714f94a49a149fdc0b644eaead5d1f089330399134fa028a \
--hash=sha256:da0c79c23a63723aa5d782250fbf51b768abca630285262fb5144ba5ae01e520 \
--hash=sha256:e2fd3bfbff3c5d74fef31a722f729bf93500910db650c925c2d6ef879a7e51cb
# via argon2-cffi
certifi==2025.8.3 \
--hash=sha256:e564105f78ded564e3ae7c923924435e1daa7463faeab5bb932bc53ffae63407 \
--hash=sha256:f6c12493cfb1b06ba2ff328595af9350c65d6644968e5d3a2ffd78699af217a5
# via
# httpcore
# httpx
cffi==2.0.0 \
--hash=sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb \
--hash=sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b \
--hash=sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f \
--hash=sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9 \
--hash=sha256:0cf2d91ecc3fcc0625c2c530fe004f82c110405f101548512cce44322fa8ac44 \
--hash=sha256:0f6084a0ea23d05d20c3edcda20c3d006f9b6f3fefeac38f59262e10cef47ee2 \
--hash=sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c \
--hash=sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75 \
--hash=sha256:1cd13c99ce269b3ed80b417dcd591415d3372bcac067009b6e0f59c7d4015e65 \
--hash=sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e \
--hash=sha256:1f72fb8906754ac8a2cc3f9f5aaa298070652a0ffae577e0ea9bd480dc3c931a \
--hash=sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e \
--hash=sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25 \
--hash=sha256:2081580ebb843f759b9f617314a24ed5738c51d2aee65d31e02f6f7a2b97707a \
--hash=sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe \
--hash=sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b \
--hash=sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91 \
--hash=sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592 \
--hash=sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187 \
--hash=sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c \
--hash=sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1 \
--hash=sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94 \
--hash=sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba \
--hash=sha256:3e837e369566884707ddaf85fc1744b47575005c0a229de3327f8f9a20f4efeb \
--hash=sha256:3f4d46d8b35698056ec29bca21546e1551a205058ae1a181d871e278b0b28165 \
--hash=sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529 \
--hash=sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca \
--hash=sha256:4647afc2f90d1ddd33441e5b0e85b16b12ddec4fca55f0d9671fef036ecca27c \
--hash=sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6 \
--hash=sha256:53f77cbe57044e88bbd5ed26ac1d0514d2acf0591dd6bb02a3ae37f76811b80c \
--hash=sha256:5eda85d6d1879e692d546a078b44251cdd08dd1cfb98dfb77b670c97cee49ea0 \
--hash=sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743 \
--hash=sha256:61d028e90346df14fedc3d1e5441df818d095f3b87d286825dfcbd6459b7ef63 \
--hash=sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5 \
--hash=sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5 \
--hash=sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4 \
--hash=sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d \
--hash=sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b \
--hash=sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93 \
--hash=sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205 \
--hash=sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27 \
--hash=sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512 \
--hash=sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d \
--hash=sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c \
--hash=sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037 \
--hash=sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26 \
--hash=sha256:89472c9762729b5ae1ad974b777416bfda4ac5642423fa93bd57a09204712322 \
--hash=sha256:8ea985900c5c95ce9db1745f7933eeef5d314f0565b27625d9a10ec9881e1bfb \
--hash=sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c \
--hash=sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8 \
--hash=sha256:9332088d75dc3241c702d852d4671613136d90fa6881da7d770a483fd05248b4 \
--hash=sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414 \
--hash=sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9 \
--hash=sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664 \
--hash=sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9 \
--hash=sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775 \
--hash=sha256:b18a3ed7d5b3bd8d9ef7a8cb226502c6bf8308df1525e1cc676c3680e7176739 \
--hash=sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc \
--hash=sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062 \
--hash=sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe \
--hash=sha256:b882b3df248017dba09d6b16defe9b5c407fe32fc7c65a9c69798e6175601be9 \
--hash=sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92 \
--hash=sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5 \
--hash=sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13 \
--hash=sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d \
--hash=sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26 \
--hash=sha256:cb527a79772e5ef98fb1d700678fe031e353e765d1ca2d409c92263c6d43e09f \
--hash=sha256:cf364028c016c03078a23b503f02058f1814320a56ad535686f90565636a9495 \
--hash=sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b \
--hash=sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6 \
--hash=sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c \
--hash=sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef \
--hash=sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5 \
--hash=sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18 \
--hash=sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad \
--hash=sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3 \
--hash=sha256:de8dad4425a6ca6e4e5e297b27b5c824ecc7581910bf9aee86cb6835e6812aa7 \
--hash=sha256:e11e82b744887154b182fd3e7e8512418446501191994dbf9c9fc1f32cc8efd5 \
--hash=sha256:e6e73b9e02893c764e7e8d5bb5ce277f1a009cd5243f8228f75f842bf937c534 \
--hash=sha256:f73b96c41e3b2adedc34a7356e64c8eb96e03a3782b535e043a986276ce12a49 \
--hash=sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2 \
--hash=sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5 \
--hash=sha256:fc7de24befaeae77ba923797c7c87834c73648a05a4bde34b3b7e5588973a453 \
--hash=sha256:fe562eb1a64e67dd297ccc4f5addea2501664954f2692b69a76449ec7913ecbf
# via argon2-cffi-bindings
click==8.2.1 \
--hash=sha256:27c491cc05d968d271d5a1db13e3b5a184636d9d930f148c50b038f0d0646202 \
--hash=sha256:61a3265b914e850b85317d0b3109c7f8cd35a670f963866005d6ef1d5175a12b
# via uvicorn
fastapi==0.116.1 \
--hash=sha256:c46ac7c312df840f0c9e220f7964bada936781bc4e2e6eb71f1c4d7553786565 \
--hash=sha256:ed52cbf946abfd70c5a0dccb24673f0670deeb517a88b3544d03c2a6bf283143
# via -r requirements.in
greenlet==3.2.4 \
--hash=sha256:00fadb3fedccc447f517ee0d3fd8fe49eae949e1cd0f6a611818f4f6fb7dc83b \
--hash=sha256:061dc4cf2c34852b052a8620d40f36324554bc192be474b9e9770e8c042fd735 \
--hash=sha256:0db5594dce18db94f7d1650d7489909b57afde4c580806b8d9203b6e79cdc079 \
--hash=sha256:0dca0d95ff849f9a364385f36ab49f50065d76964944638be9691e1832e9f86d \
--hash=sha256:16458c245a38991aa19676900d48bd1a6f2ce3e16595051a4db9d012154e8433 \
--hash=sha256:18d9260df2b5fbf41ae5139e1be4e796d99655f023a636cd0e11e6406cca7d58 \
--hash=sha256:1987de92fec508535687fb807a5cea1560f6196285a4cde35c100b8cd632cc52 \
--hash=sha256:1a921e542453fe531144e91e1feedf12e07351b1cf6c9e8a3325ea600a715a31 \
--hash=sha256:1ee8fae0519a337f2329cb78bd7a8e128ec0f881073d43f023c7b8d4831d5246 \
--hash=sha256:20fb936b4652b6e307b8f347665e2c615540d4b42b3b4c8a321d8286da7e520f \
--hash=sha256:23768528f2911bcd7e475210822ffb5254ed10d71f4028387e5a99b4c6699671 \
--hash=sha256:2523e5246274f54fdadbce8494458a2ebdcdbc7b802318466ac5606d3cded1f8 \
--hash=sha256:27890167f55d2387576d1f41d9487ef171849ea0359ce1510ca6e06c8bece11d \
--hash=sha256:299fd615cd8fc86267b47597123e3f43ad79c9d8a22bebdce535e53550763e2f \
--hash=sha256:3b3812d8d0c9579967815af437d96623f45c0f2ae5f04e366de62a12d83a8fb0 \
--hash=sha256:3b67ca49f54cede0186854a008109d6ee71f66bd57bb36abd6d0a0267b540cdd \
--hash=sha256:44358b9bf66c8576a9f57a590d5f5d6e72fa4228b763d0e43fee6d3b06d3a337 \
--hash=sha256:49a30d5fda2507ae77be16479bdb62a660fa51b1eb4928b524975b3bde77b3c0 \
--hash=sha256:4d1378601b85e2e5171b99be8d2dc85f594c79967599328f95c1dc1a40f1c633 \
--hash=sha256:554b03b6e73aaabec3745364d6239e9e012d64c68ccd0b8430c64ccc14939a8b \
--hash=sha256:55e9c5affaa6775e2c6b67659f3a71684de4c549b3dd9afca3bc773533d284fa \
--hash=sha256:58b97143c9cc7b86fc458f215bd0932f1757ce649e05b640fea2e79b54cedb31 \
--hash=sha256:5c9320971821a7cb77cfab8d956fa8e39cd07ca44b6070db358ceb7f8797c8c9 \
--hash=sha256:65458b409c1ed459ea899e939f0e1cdb14f58dbc803f2f93c5eab5694d32671b \
--hash=sha256:671df96c1f23c4a0d4077a325483c1503c96a1b7d9db26592ae770daa41233d4 \
--hash=sha256:710638eb93b1fa52823aa91bf75326f9ecdfd5e0466f00789246a5280f4ba0fc \
--hash=sha256:73f49b5368b5359d04e18d15828eecc1806033db5233397748f4ca813ff1056c \
--hash=sha256:81701fd84f26330f0d5f4944d4e92e61afe6319dcd9775e39396e39d7c3e5f98 \
--hash=sha256:8854167e06950ca75b898b104b63cc646573aa5fef1353d4508ecdd1ee76254f \
--hash=sha256:8c68325b0d0acf8d91dde4e6f930967dd52a5302cd4062932a6b2e7c2969f47c \
--hash=sha256:94385f101946790ae13da500603491f04a76b6e4c059dab271b3ce2e283b2590 \
--hash=sha256:94abf90142c2a18151632371140b3dba4dee031633fe614cb592dbb6c9e17bc3 \
--hash=sha256:96378df1de302bc38e99c3a9aa311967b7dc80ced1dcc6f171e99842987882a2 \
--hash=sha256:9c40adce87eaa9ddb593ccb0fa6a07caf34015a29bf8d344811665b573138db9 \
--hash=sha256:9fe0a28a7b952a21e2c062cd5756d34354117796c6d9215a87f55e38d15402c5 \
--hash=sha256:a7d4e128405eea3814a12cc2605e0e6aedb4035bf32697f72deca74de4105e02 \
--hash=sha256:abbf57b5a870d30c4675928c37278493044d7c14378350b3aa5d484fa65575f0 \
--hash=sha256:b4a1870c51720687af7fa3e7cda6d08d801dae660f75a76f3845b642b4da6ee1 \
--hash=sha256:b6a7c19cf0d2742d0809a4c05975db036fdff50cd294a93632d6a310bf9ac02c \
--hash=sha256:b90654e092f928f110e0007f572007c9727b5265f7632c2fa7415b4689351594 \
--hash=sha256:c17b6b34111ea72fc5a4e4beec9711d2226285f0386ea83477cbb97c30a3f3a5 \
--hash=sha256:c2ca18a03a8cfb5b25bc1cbe20f3d9a4c80d8c3b13ba3df49ac3961af0b1018d \
--hash=sha256:c5111ccdc9c88f423426df3fd1811bfc40ed66264d35aa373420a34377efc98a \
--hash=sha256:c60a6d84229b271d44b70fb6e5fa23781abb5d742af7b808ae3f6efd7c9c60f6 \
--hash=sha256:c8c9e331e58180d0d83c5b7999255721b725913ff6bc6cf39fa2a45841a4fd4b \
--hash=sha256:c9913f1a30e4526f432991f89ae263459b1c64d1608c0d22a5c79c287b3c70df \
--hash=sha256:cd3c8e693bff0fff6ba55f140bf390fa92c994083f838fece0f63be121334945 \
--hash=sha256:d25c5091190f2dc0eaa3f950252122edbbadbb682aa7b1ef2f8af0f8c0afefae \
--hash=sha256:d2e685ade4dafd447ede19c31277a224a239a0a1a4eca4e6390efedf20260cfb \
--hash=sha256:d76383238584e9711e20ebe14db6c88ddcedc1829a9ad31a584389463b5aa504 \
--hash=sha256:ddf9164e7a5b08e9d22511526865780a576f19ddd00d62f8a665949327fde8bb \
--hash=sha256:e37ab26028f12dbb0ff65f29a8d3d44a765c61e729647bf2ddfbbed621726f01 \
--hash=sha256:f10fd42b5ee276335863712fa3da6608e93f70629c631bf77145021600abc23c \
--hash=sha256:f28588772bb5fb869a8eb331374ec06f24a83a9c25bfa1f38b6993afe9c1e968
# via sqlalchemy
h11==0.16.0 \
--hash=sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1 \
--hash=sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86
# via
# httpcore
# uvicorn
httpcore==1.0.9 \
--hash=sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55 \
--hash=sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8
# via httpx
httpx==0.28.1 \
--hash=sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc \
--hash=sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad
# via -r requirements.in
idna==3.10 \
--hash=sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9 \
--hash=sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3
# via
# anyio
# httpx
iniconfig==2.1.0 \
--hash=sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7 \
--hash=sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760
# via pytest
packaging==25.0 \
--hash=sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484 \
--hash=sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f
# via pytest
pluggy==1.6.0 \
--hash=sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3 \
--hash=sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746
# via pytest
pycparser==2.23 \
--hash=sha256:78816d4f24add8f10a06d6f05b4d424ad9e96cfebf68a4ddc99c65c0720d00c2 \
--hash=sha256:e5c6e8d3fbad53479cab09ac03729e0a9faf2bee3db8208a550daf5af81a5934
# via cffi
pydantic==2.11.7 \
--hash=sha256:d989c3c6cb79469287b1569f7447a17848c998458d49ebe294e975b9baf0f0db \
--hash=sha256:dde5df002701f6de26248661f6835bbe296a47bf73990135c7d07ce741b9623b
# via
# fastapi
# pydantic-settings
# sqlmodel
pydantic-core==2.33.2 \
--hash=sha256:0069c9acc3f3981b9ff4cdfaf088e98d83440a4c7ea1bc07460af3d4dc22e72d \
--hash=sha256:031c57d67ca86902726e0fae2214ce6770bbe2f710dc33063187a68744a5ecac \
--hash=sha256:0405262705a123b7ce9f0b92f123334d67b70fd1f20a9372b907ce1080c7ba02 \
--hash=sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56 \
--hash=sha256:09fb9dd6571aacd023fe6aaca316bd01cf60ab27240d7eb39ebd66a3a15293b4 \
--hash=sha256:0a39979dcbb70998b0e505fb1556a1d550a0781463ce84ebf915ba293ccb7e22 \
--hash=sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef \
--hash=sha256:0e03262ab796d986f978f79c943fc5f620381be7287148b8010b4097f79a39ec \
--hash=sha256:0e5b2671f05ba48b94cb90ce55d8bdcaaedb8ba00cc5359f6810fc918713983d \
--hash=sha256:0e6116757f7959a712db11f3e9c0a99ade00a5bbedae83cb801985aa154f071b \
--hash=sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a \
--hash=sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f \
--hash=sha256:1a8695a8d00c73e50bff9dfda4d540b7dee29ff9b8053e38380426a85ef10052 \
--hash=sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab \
--hash=sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916 \
--hash=sha256:2058a32994f1fde4ca0480ab9d1e75a0e8c87c22b53a3ae66554f9af78f2fe8c \
--hash=sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf \
--hash=sha256:2807668ba86cb38c6817ad9bc66215ab8584d1d304030ce4f0887336f28a5e27 \
--hash=sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a \
--hash=sha256:2b3d326aaef0c0399d9afffeb6367d5e26ddc24d351dbc9c636840ac355dc5d8 \
--hash=sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7 \
--hash=sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612 \
--hash=sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1 \
--hash=sha256:3a1c81334778f9e3af2f8aeb7a960736e5cab1dfebfb26aabca09afd2906c039 \
--hash=sha256:3abcd9392a36025e3bd55f9bd38d908bd17962cc49bc6da8e7e96285336e2bca \
--hash=sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7 \
--hash=sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a \
--hash=sha256:3eb3fe62804e8f859c49ed20a8451342de53ed764150cb14ca71357c765dc2a6 \
--hash=sha256:44857c3227d3fb5e753d5fe4a3420d6376fa594b07b621e220cd93703fe21782 \
--hash=sha256:4b25d91e288e2c4e0662b8038a28c6a07eaac3e196cfc4ff69de4ea3db992a1b \
--hash=sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7 \
--hash=sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025 \
--hash=sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849 \
--hash=sha256:53a57d2ed685940a504248187d5685e49eb5eef0f696853647bf37c418c538f7 \
--hash=sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b \
--hash=sha256:5c4aa4e82353f65e548c476b37e64189783aa5384903bfea4f41580f255fddfa \
--hash=sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e \
--hash=sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea \
--hash=sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac \
--hash=sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51 \
--hash=sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e \
--hash=sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162 \
--hash=sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65 \
--hash=sha256:6bdfe4b3789761f3bcb4b1ddf33355a71079858958e3a552f16d5af19768fef2 \
--hash=sha256:6fa6dfc3e4d1f734a34710f391ae822e0a8eb8559a85c6979e14e65ee6ba2954 \
--hash=sha256:73662edf539e72a9440129f231ed3757faab89630d291b784ca99237fb94db2b \
--hash=sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de \
--hash=sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc \
--hash=sha256:7f92c15cd1e97d4b12acd1cc9004fa092578acfa57b67ad5e43a197175d01a64 \
--hash=sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb \
--hash=sha256:83aa99b1285bc8f038941ddf598501a86f1536789740991d7d8756e34f1e74d9 \
--hash=sha256:87acbfcf8e90ca885206e98359d7dca4bcbb35abdc0ff66672a293e1d7a19101 \
--hash=sha256:87b31b6846e361ef83fedb187bb5b4372d0da3f7e28d85415efa92d6125d6e6d \
--hash=sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef \
--hash=sha256:8d55ab81c57b8ff8548c3e4947f119551253f4e3787a7bbc0b6b3ca47498a9d3 \
--hash=sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1 \
--hash=sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5 \
--hash=sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88 \
--hash=sha256:970919794d126ba8645f3837ab6046fb4e72bbc057b3709144066204c19a455d \
--hash=sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290 \
--hash=sha256:9fcd347d2cc5c23b06de6d3b7b8275be558a0c90549495c699e379a80bf8379e \
--hash=sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d \
--hash=sha256:a11c8d26a50bfab49002947d3d237abe4d9e4b5bdc8846a63537b6488e197808 \
--hash=sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc \
--hash=sha256:a2b911a5b90e0374d03813674bf0a5fbbb7741570dcd4b4e85a2e48d17def29d \
--hash=sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc \
--hash=sha256:aa9d91b338f2df0508606f7009fde642391425189bba6d8c653afd80fd6bb64e \
--hash=sha256:b0379a2b24882fef529ec3b4987cb5d003b9cda32256024e6fe1586ac45fc640 \
--hash=sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30 \
--hash=sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e \
--hash=sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9 \
--hash=sha256:c20c462aa4434b33a2661701b861604913f912254e441ab8d78d30485736115a \
--hash=sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9 \
--hash=sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f \
--hash=sha256:c54c939ee22dc8e2d545da79fc5381f1c020d6d3141d3bd747eab59164dc89fb \
--hash=sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5 \
--hash=sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab \
--hash=sha256:d3f26877a748dc4251cfcfda9dfb5f13fcb034f5308388066bcfe9031b63ae7d \
--hash=sha256:d53b22f2032c42eaaf025f7c40c2e3b94568ae077a606f006d206a463bc69572 \
--hash=sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593 \
--hash=sha256:d946c8bf0d5c24bf4fe333af284c59a19358aa3ec18cb3dc4370080da1e8ad29 \
--hash=sha256:dac89aea9af8cd672fa7b510e7b8c33b0bba9a43186680550ccf23020f32d535 \
--hash=sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1 \
--hash=sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f \
--hash=sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8 \
--hash=sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf \
--hash=sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246 \
--hash=sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9 \
--hash=sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011 \
--hash=sha256:eb9b459ca4df0e5c87deb59d37377461a538852765293f9e6ee834f0435a93b9 \
--hash=sha256:efec8db3266b76ef9607c2c4c419bdb06bf335ae433b80816089ea7585816f6a \
--hash=sha256:f481959862f57f29601ccced557cc2e817bce7533ab8e01a797a48b49c9692b3 \
--hash=sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6 \
--hash=sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8 \
--hash=sha256:f8de619080e944347f5f20de29a975c2d815d9ddd8be9b9b7268e2e3ef68605a \
--hash=sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2 \
--hash=sha256:fa754d1850735a0b0e03bcffd9d4b4343eb417e47196e4485d9cca326073a42c \
--hash=sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6 \
--hash=sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d
# via pydantic
pydantic-settings==2.10.1 \
--hash=sha256:06f0062169818d0f5524420a360d632d5857b83cffd4d42fe29597807a1614ee \
--hash=sha256:a60952460b99cf661dc25c29c0ef171721f98bfcb52ef8d9ea4c943d7c8cc796
# via -r requirements.in
pygments==2.19.2 \
--hash=sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887 \
--hash=sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b
# via pytest
pytest==8.4.2 \
--hash=sha256:86c0d0b93306b961d58d62a4db4879f27fe25513d4b969df351abdddb3c30e01 \
--hash=sha256:872f880de3fc3a5bdc88a11b39c9710c3497a547cfa9320bc3c5e62fbf272e79
# via -r dev-requirements.in
python-dotenv==1.1.1 \
--hash=sha256:31f23644fe2602f88ff55e1f5c79ba497e01224ee7737937930c448e4d0e24dc \
--hash=sha256:a8a6399716257f45be6a007360200409fce5cda2661e3dec71d23dc15f6189ab
# via pydantic-settings
pyyaml==6.0.2 \
--hash=sha256:01179a4a8559ab5de078078f37e5c1a30d76bb88519906844fd7bdea1b7729ff \
--hash=sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48 \
--hash=sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086 \
--hash=sha256:0b69e4ce7a131fe56b7e4d770c67429700908fc0752af059838b1cfb41960e4e \
--hash=sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133 \
--hash=sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5 \
--hash=sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484 \
--hash=sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee \
--hash=sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5 \
--hash=sha256:23502f431948090f597378482b4812b0caae32c22213aecf3b55325e049a6c68 \
--hash=sha256:24471b829b3bf607e04e88d79542a9d48bb037c2267d7927a874e6c205ca7e9a \
--hash=sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf \
--hash=sha256:2e99c6826ffa974fe6e27cdb5ed0021786b03fc98e5ee3c5bfe1fd5015f42b99 \
--hash=sha256:39693e1f8320ae4f43943590b49779ffb98acb81f788220ea932a6b6c51004d8 \
--hash=sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85 \
--hash=sha256:3b1fdb9dc17f5a7677423d508ab4f243a726dea51fa5e70992e59a7411c89d19 \
--hash=sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc \
--hash=sha256:43fa96a3ca0d6b1812e01ced1044a003533c47f6ee8aca31724f78e93ccc089a \
--hash=sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1 \
--hash=sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317 \
--hash=sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c \
--hash=sha256:6395c297d42274772abc367baaa79683958044e5d3835486c16da75d2a694631 \
--hash=sha256:688ba32a1cffef67fd2e9398a2efebaea461578b0923624778664cc1c914db5d \
--hash=sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652 \
--hash=sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5 \
--hash=sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e \
--hash=sha256:7c36280e6fb8385e520936c3cb3b8042851904eba0e58d277dca80a5cfed590b \
--hash=sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8 \
--hash=sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476 \
--hash=sha256:82d09873e40955485746739bcb8b4586983670466c23382c19cffecbf1fd8706 \
--hash=sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563 \
--hash=sha256:8824b5a04a04a047e72eea5cec3bc266db09e35de6bdfe34c9436ac5ee27d237 \
--hash=sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b \
--hash=sha256:9056c1ecd25795207ad294bcf39f2db3d845767be0ea6e6a34d856f006006083 \
--hash=sha256:936d68689298c36b53b29f23c6dbb74de12b4ac12ca6cfe0e047bedceea56180 \
--hash=sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425 \
--hash=sha256:a4d3091415f010369ae4ed1fc6b79def9416358877534caf6a0fdd2146c87a3e \
--hash=sha256:a8786accb172bd8afb8be14490a16625cbc387036876ab6ba70912730faf8e1f \
--hash=sha256:a9f8c2e67970f13b16084e04f134610fd1d374bf477b17ec1599185cf611d725 \
--hash=sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183 \
--hash=sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab \
--hash=sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774 \
--hash=sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725 \
--hash=sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e \
--hash=sha256:d7fded462629cfa4b685c5416b949ebad6cec74af5e2d42905d41e257e0869f5 \
--hash=sha256:d84a1718ee396f54f3a086ea0a66d8e552b2ab2017ef8b420e92edbc841c352d \
--hash=sha256:d8e03406cac8513435335dbab54c0d385e4a49e4945d2909a581c83647ca0290 \
--hash=sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44 \
--hash=sha256:ec031d5d2feb36d1d1a24380e4db6d43695f3748343d99434e6f5f9156aaa2ed \
--hash=sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4 \
--hash=sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba \
--hash=sha256:f753120cb8181e736c57ef7636e83f31b9c0d1722c516f7e86cf15b7aa57ff12 \
--hash=sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4
# via -r requirements.in
sniffio==1.3.1 \
--hash=sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2 \
--hash=sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc
# via anyio
sqlalchemy==2.0.43 \
--hash=sha256:022e436a1cb39b13756cf93b48ecce7aa95382b9cfacceb80a7d263129dfd019 \
--hash=sha256:03d73ab2a37d9e40dec4984d1813d7878e01dbdc742448d44a7341b7a9f408c7 \
--hash=sha256:07097c0a1886c150ef2adba2ff7437e84d40c0f7dcb44a2c2b9c905ccfc6361c \
--hash=sha256:11b9503fa6f8721bef9b8567730f664c5a5153d25e247aadc69247c4bc605227 \
--hash=sha256:11f43c39b4b2ec755573952bbcc58d976779d482f6f832d7f33a8d869ae891bf \
--hash=sha256:13194276e69bb2af56198fef7909d48fd34820de01d9c92711a5fa45497cc7ed \
--hash=sha256:136063a68644eca9339d02e6693932116f6a8591ac013b0014479a1de664e40a \
--hash=sha256:14111d22c29efad445cd5021a70a8b42f7d9152d8ba7f73304c4d82460946aaa \
--hash=sha256:1681c21dd2ccee222c2fe0bef671d1aef7c504087c9c4e800371cfcc8ac966fc \
--hash=sha256:1a113da919c25f7f641ffbd07fbc9077abd4b3b75097c888ab818f962707eb48 \
--hash=sha256:1c6d85327ca688dbae7e2b06d7d84cfe4f3fffa5b5f9e21bb6ce9d0e1a0e0e0a \
--hash=sha256:20d81fc2736509d7a2bd33292e489b056cbae543661bb7de7ce9f1c0cd6e7f24 \
--hash=sha256:21b27b56eb2f82653168cefe6cb8e970cdaf4f3a6cb2c5e3c3c1cf3158968ff9 \
--hash=sha256:21ba7a08a4253c5825d1db389d4299f64a100ef9800e4624c8bf70d8f136e6ed \
--hash=sha256:227119ce0a89e762ecd882dc661e0aa677a690c914e358f0dd8932a2e8b2765b \
--hash=sha256:25b9fc27650ff5a2c9d490c13c14906b918b0de1f8fcbb4c992712d8caf40e83 \
--hash=sha256:334f41fa28de9f9be4b78445e68530da3c5fa054c907176460c81494f4ae1f5e \
--hash=sha256:413391b2239db55be14fa4223034d7e13325a1812c8396ecd4f2c08696d5ccad \
--hash=sha256:4286a1139f14b7d70141c67a8ae1582fc2b69105f1b09d9573494eb4bb4b2687 \
--hash=sha256:44337823462291f17f994d64282a71c51d738fc9ef561bf265f1d0fd9116a782 \
--hash=sha256:46293c39252f93ea0910aababa8752ad628bcce3a10d3f260648dd472256983f \
--hash=sha256:4bf0edb24c128b7be0c61cd17eef432e4bef507013292415f3fb7023f02b7d4b \
--hash=sha256:4d3d9b904ad4a6b175a2de0738248822f5ac410f52c2fd389ada0b5262d6a1e3 \
--hash=sha256:4e6aeb2e0932f32950cf56a8b4813cb15ff792fc0c9b3752eaf067cfe298496a \
--hash=sha256:4fb1a8c5438e0c5ea51afe9c6564f951525795cf432bed0c028c1cb081276685 \
--hash=sha256:529064085be2f4d8a6e5fab12d36ad44f1909a18848fcfbdb59cc6d4bbe48efe \
--hash=sha256:52d9b73b8fb3e9da34c2b31e6d99d60f5f99fd8c1225c9dad24aeb74a91e1d29 \
--hash=sha256:5cda6b51faff2639296e276591808c1726c4a77929cfaa0f514f30a5f6156921 \
--hash=sha256:5d79f9fdc9584ec83d1b3c75e9f4595c49017f5594fee1a2217117647225d738 \
--hash=sha256:61f964a05356f4bca4112e6334ed7c208174511bd56e6b8fc86dad4d024d4185 \
--hash=sha256:6772e3ca8a43a65a37c88e2f3e2adfd511b0b1da37ef11ed78dea16aeae85bd9 \
--hash=sha256:6e2bf13d9256398d037fef09fd8bf9b0bf77876e22647d10761d35593b9ac547 \
--hash=sha256:70322986c0c699dca241418fcf18e637a4369e0ec50540a2b907b184c8bca069 \
--hash=sha256:788bfcef6787a7764169cfe9859fe425bf44559619e1d9f56f5bddf2ebf6f417 \
--hash=sha256:7f1ac7828857fcedb0361b48b9ac4821469f7694089d15550bbcf9ab22564a1d \
--hash=sha256:87accdbba88f33efa7b592dc2e8b2a9c2cdbca73db2f9d5c510790428c09c154 \
--hash=sha256:8cee08f15d9e238ede42e9bbc1d6e7158d0ca4f176e4eab21f88ac819ae3bd7b \
--hash=sha256:971ba928fcde01869361f504fcff3b7143b47d30de188b11c6357c0505824197 \
--hash=sha256:9c2e02f06c68092b875d5cbe4824238ab93a7fa35d9c38052c033f7ca45daa18 \
--hash=sha256:9c5a9da957c56e43d72126a3f5845603da00e0293720b03bde0aacffcf2dc04f \
--hash=sha256:9df7126fd9db49e3a5a3999442cc67e9ee8971f3cb9644250107d7296cb2a164 \
--hash=sha256:b3edaec7e8b6dc5cd94523c6df4f294014df67097c8217a89929c99975811414 \
--hash=sha256:b535d35dea8bbb8195e7e2b40059e2253acb2b7579b73c1b432a35363694641d \
--hash=sha256:bcf0724a62a5670e5718957e05c56ec2d6850267ea859f8ad2481838f889b42c \
--hash=sha256:c00e7845d2f692ebfc7d5e4ec1a3fd87698e4337d09e58d6749a16aedfdf8612 \
--hash=sha256:c379e37b08c6c527181a397212346be39319fb64323741d23e46abd97a400d34 \
--hash=sha256:c5d1730b25d9a07727d20ad74bc1039bbbb0a6ca24e6769861c1aa5bf2c4c4a8 \
--hash=sha256:c5e73ba0d76eefc82ec0219d2301cb33bfe5205ed7a2602523111e2e56ccbd20 \
--hash=sha256:c697575d0e2b0a5f0433f679bda22f63873821d991e95a90e9e52aae517b2e32 \
--hash=sha256:cdeff998cb294896a34e5b2f00e383e7c5c4ef3b4bfa375d9104723f15186443 \
--hash=sha256:ceb5c832cc30663aeaf5e39657712f4c4241ad1f638d487ef7216258f6d41fe7 \
--hash=sha256:d34c0f6dbefd2e816e8f341d0df7d4763d382e3f452423e752ffd1e213da2512 \
--hash=sha256:db691fa174e8f7036afefe3061bc40ac2b770718be2862bfb03aabae09051aca \
--hash=sha256:e7a903b5b45b0d9fa03ac6a331e1c1d6b7e0ab41c63b6217b3d10357b83c8b00 \
--hash=sha256:e7c08f57f75a2bb62d7ee80a89686a5e5669f199235c6d1dac75cd59374091c3 \
--hash=sha256:f42f23e152e4545157fa367b2435a1ace7571cab016ca26038867eb7df2c3631 \
--hash=sha256:fe2b3b4927d0bc03d02ad883f402d5de201dbc8894ac87d2e981e7d87430e60d
# via sqlmodel
sqlmodel==0.0.24 \
--hash=sha256:6778852f09370908985b667d6a3ab92910d0d5ec88adcaf23dbc242715ff7193 \
--hash=sha256:cc5c7613c1a5533c9c7867e1aab2fd489a76c9e8a061984da11b4e613c182423
# via -r requirements.in
starlette==0.47.3 \
--hash=sha256:6bc94f839cc176c4858894f1f8908f0ab79dfec1a6b8402f6da9be26ebea52e9 \
--hash=sha256:89c0778ca62a76b826101e7c709e70680a1699ca7da6b44d38eb0a7e61fe4b51
# via fastapi
typing-extensions==4.15.0 \
--hash=sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466 \
--hash=sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548
# via
# anyio
# fastapi
# pydantic
# pydantic-core
# sqlalchemy
# starlette
# typing-inspection
typing-inspection==0.4.1 \
--hash=sha256:389055682238f53b04f7badcb49b989835495a96700ced5dab2d8feae4b26f51 \
--hash=sha256:6ae134cc0203c33377d43188d4064e9b357dba58cff3185f22924610e70a9d28
# via
# pydantic
# pydantic-settings
uvicorn==0.35.0 \
--hash=sha256:197535216b25ff9b785e29a0b79199f55222193d47f820816e7da751e9bc8d4a \
--hash=sha256:bc662f087f7cf2ce11a1d7fd70b90c9f98ef2e2831556dd078d131b96cc94a01
# via -r requirements.in
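The `--hash` pins above activate pip's hash-checking mode, and the `# via -r requirements.in` markers suggest a pip-compile workflow. Assuming the file is saved as `requirements.txt`, it can be installed reproducibly like this:

```shell
# Hash-checking mode: pip refuses any wheel/sdist whose digest does not match a pinned --hash.
# It is implied as soon as any requirement carries a hash; --require-hashes makes it explicit.
pip install --require-hashes -r requirements.txt
```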



@@ -1,7 +0,0 @@
from typing import Optional
from pydantic import BaseModel
class MsgPayload(BaseModel):
    msg_id: Optional[int]
    msg_name: str
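For reference, the removed `MsgPayload` model behaves as follows under the pinned pydantic==2.11.7 (a minimal sketch; note that in Pydantic v2 an `Optional[...]` annotation alone does not make a field optional — the `= None` default below is an addition the deleted code lacked):

```python
from typing import Optional

from pydantic import BaseModel


class MsgPayload(BaseModel):
    msg_id: Optional[int] = None  # nullable AND optional; without "= None" it would be required
    msg_name: str


payload = MsgPayload(msg_name="hello")
print(payload.msg_id)    # None
print(payload.msg_name)  # hello
```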

backend/openapi.yaml Normal file

@@ -0,0 +1,554 @@
openapi: "3.0.3"
info:
title: Trading Journal API
version: "1.0.0"
description: OpenAPI description generated from [`app.py`](app.py) and DTOs in [`trading_journal/dto.py`](trading_journal/dto.py).
servers:
- url: "http://127.0.0.1:18881{basePath}"
variables:
basePath:
default: "/api/v1"
description: "API base path (matches settings.settings.api_base)"
components:
securitySchemes:
session_cookie:
type: apiKey
in: cookie
name: session_token
schemas:
UserCreate:
$ref: "#/components/schemas/UserCreate_impl"
UserCreate_impl:
type: object
required:
- username
- password
properties:
username:
type: string
is_active:
type: boolean
default: true
password:
type: string
UserLogin:
type: object
required:
- username
- password
properties:
username:
type: string
password:
type: string
UserRead:
type: object
required:
- id
- username
properties:
id:
type: integer
username:
type: string
is_active:
type: boolean
SessionsBase:
type: object
required:
- user_id
properties:
user_id:
type: integer
SessionsCreate:
allOf:
- $ref: "#/components/schemas/SessionsBase"
- type: object
required:
- expires_at
properties:
expires_at:
type: string
format: date-time
ExchangesBase:
type: object
required:
- name
properties:
name:
type: string
notes:
type: string
nullable: true
ExchangesRead:
allOf:
- $ref: "#/components/schemas/ExchangesBase"
- type: object
required:
- id
properties:
id:
type: integer
CycleBase:
type: object
properties:
friendly_name:
type: string
nullable: true
status:
type: string
end_date:
type: string
format: date
nullable: true
funding_source:
type: string
nullable: true
capital_exposure_cents:
type: integer
nullable: true
loan_amount_cents:
type: integer
nullable: true
loan_interest_rate_tenth_bps:
type: integer
nullable: true
trades:
type: array
items:
$ref: "#/components/schemas/TradeRead"
nullable: true
exchange:
$ref: "#/components/schemas/ExchangesRead"
nullable: true
CycleCreate:
allOf:
- $ref: "#/components/schemas/CycleBase"
- type: object
required:
- user_id
- symbol
- exchange_id
- underlying_currency
- start_date
properties:
user_id:
type: integer
symbol:
type: string
exchange_id:
type: integer
underlying_currency:
type: string
start_date:
type: string
format: date
CycleUpdate:
allOf:
- $ref: "#/components/schemas/CycleBase"
- type: object
required:
- id
properties:
id:
type: integer
CycleRead:
allOf:
- $ref: "#/components/schemas/CycleCreate"
- type: object
required:
- id
properties:
id:
type: integer
TradeBase:
type: object
required:
- symbol
- underlying_currency
- trade_type
- trade_strategy
- trade_date
- quantity
- price_cents
- commission_cents
properties:
friendly_name:
type: string
nullable: true
symbol:
type: string
exchange_id:
type: integer
underlying_currency:
type: string
trade_type:
type: string
trade_strategy:
type: string
trade_date:
type: string
format: date
quantity:
type: integer
price_cents:
type: integer
commission_cents:
type: integer
notes:
type: string
nullable: true
cycle_id:
type: integer
nullable: true
TradeCreate:
allOf:
- $ref: "#/components/schemas/TradeBase"
- type: object
properties:
user_id:
type: integer
nullable: true
trade_time_utc:
type: string
format: date-time
nullable: true
gross_cash_flow_cents:
type: integer
nullable: true
net_cash_flow_cents:
type: integer
nullable: true
quantity_multiplier:
type: integer
default: 1
expiry_date:
type: string
format: date
nullable: true
strike_price_cents:
type: integer
nullable: true
is_invalidated:
type: boolean
default: false
invalidated_at:
type: string
format: date-time
nullable: true
replaced_by_trade_id:
type: integer
nullable: true
TradeNoteUpdate:
type: object
required:
- id
properties:
id:
type: integer
notes:
type: string
nullable: true
TradeFriendlyNameUpdate:
type: object
required:
- id
- friendly_name
properties:
id:
type: integer
friendly_name:
type: string
TradeRead:
allOf:
- $ref: "#/components/schemas/TradeCreate"
- type: object
required:
- id
properties:
id:
type: integer
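The monetary and rate columns in the schemas above are plain integers: amounts are stored in cents (`price_cents`, `capital_exposure_cents`, ...) and interest rates in tenths of a basis point (`loan_interest_rate_tenth_bps`). A minimal sketch of the implied conversions, assuming those unit conventions (the helper names are illustrative, not part of the API):

```python
from decimal import Decimal

def cents_to_amount(cents: int) -> Decimal:
    """Convert an integer *_cents field to a decimal currency amount."""
    return Decimal(cents) / 100

def tenth_bps_to_fraction(tenth_bps: int) -> Decimal:
    """Convert loan_interest_rate_tenth_bps to a plain fraction.

    1 basis point = 1/10_000, and the field stores tenths of a basis
    point, so divide by 100_000.
    """
    return Decimal(tenth_bps) / 100_000

print(cents_to_amount(150_000))      # Decimal('1500'): 1500.00 in the cycle currency
print(tenth_bps_to_fraction(4_250))  # Decimal('0.0425'): a 4.25% rate
```

Keeping money and rates as integers sidesteps binary floating-point rounding; the conversion to `Decimal` only needs to happen at the display boundary.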
paths:
/status:
get:
summary: "Get API status"
security: [] # no auth required
responses:
"200":
description: OK
content:
application/json:
schema:
type: object
properties:
status:
type: string
/register:
post:
summary: "Register user"
security: [] # no auth required
requestBody:
required: true
content:
application/json:
schema:
$ref: "#/components/schemas/UserCreate"
responses:
"201":
description: Created
content:
application/json:
schema:
$ref: "#/components/schemas/UserRead"
"400":
description: Bad Request (user exists)
"500":
description: Internal Server Error
/login:
post:
summary: "Login"
security: [] # no auth required
requestBody:
required: true
content:
application/json:
schema:
$ref: "#/components/schemas/UserLogin"
responses:
"200":
description: OK (sets session cookie)
content:
application/json:
schema:
$ref: "#/components/schemas/SessionsBase"
headers:
Set-Cookie:
description: session cookie
schema:
type: string
"401":
description: Unauthorized
"500":
description: Internal Server Error
/exchanges:
post:
summary: "Create exchange"
security:
- session_cookie: []
requestBody:
required: true
content:
application/json:
schema:
$ref: "#/components/schemas/ExchangesBase"
responses:
"201":
description: Created
content:
application/json:
schema:
$ref: "#/components/schemas/ExchangesRead"
"400":
description: Bad Request
"401":
description: Unauthorized
get:
summary: "List user exchanges"
security:
- session_cookie: []
responses:
"200":
description: OK
content:
application/json:
schema:
type: array
items:
$ref: "#/components/schemas/ExchangesRead"
"401":
description: Unauthorized
/exchanges/{exchange_id}:
patch:
summary: "Update exchange"
security:
- session_cookie: []
parameters:
- name: exchange_id
in: path
required: true
schema:
type: integer
requestBody:
required: true
content:
application/json:
schema:
$ref: "#/components/schemas/ExchangesBase"
responses:
"200":
description: Updated
content:
application/json:
schema:
$ref: "#/components/schemas/ExchangesRead"
"404":
description: Not found
"400":
description: Bad request
/cycles:
post:
summary: "Create cycle (currently returns 405 in code)"
security:
- session_cookie: []
requestBody:
required: true
content:
application/json:
schema:
$ref: "#/components/schemas/CycleBase"
responses:
"405":
description: Method Not Allowed (cycle creation is not implemented yet)
patch:
summary: "Update cycle"
security:
- session_cookie: []
requestBody:
required: true
content:
application/json:
schema:
$ref: "#/components/schemas/CycleUpdate"
responses:
"200":
description: Updated
content:
application/json:
schema:
$ref: "#/components/schemas/CycleRead"
"400":
description: Invalid data
"404":
description: Not found
/cycles/{cycle_id}:
get:
summary: "Get cycle by id"
security:
- session_cookie: []
parameters:
- name: cycle_id
in: path
required: true
schema:
type: integer
responses:
"200":
description: OK
content:
application/json:
schema:
$ref: "#/components/schemas/CycleRead"
"404":
description: Not found
/cycles/user/{user_id}:
get:
summary: "Get cycles by user id"
security:
- session_cookie: []
parameters:
- name: user_id
in: path
required: true
schema:
type: integer
responses:
"200":
description: OK
content:
application/json:
schema:
type: array
items:
$ref: "#/components/schemas/CycleRead"
/trades:
post:
summary: "Create trade"
security:
- session_cookie: []
requestBody:
required: true
content:
application/json:
schema:
$ref: "#/components/schemas/TradeCreate"
responses:
"201":
description: Created
content:
application/json:
schema:
$ref: "#/components/schemas/TradeRead"
"400":
description: Invalid trade data
"500":
description: Internal Server Error
/trades/{trade_id}:
get:
summary: "Get trade by id"
security:
- session_cookie: []
parameters:
- name: trade_id
in: path
required: true
schema:
type: integer
responses:
"200":
description: OK
content:
application/json:
schema:
$ref: "#/components/schemas/TradeRead"
"404":
description: Not found
/trades/friendlyname:
patch:
summary: "Update trade friendly name"
security:
- session_cookie: []
requestBody:
required: true
content:
application/json:
schema:
$ref: "#/components/schemas/TradeFriendlyNameUpdate"
responses:
"200":
description: Updated
content:
application/json:
schema:
$ref: "#/components/schemas/TradeRead"
"404":
description: Not found
/trades/notes:
patch:
summary: "Update trade notes"
security:
- session_cookie: []
requestBody:
required: true
content:
application/json:
schema:
$ref: "#/components/schemas/TradeNoteUpdate"
responses:
"200":
description: Updated
content:
application/json:
schema:
$ref: "#/components/schemas/TradeRead"
"404":
description: Not found
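Several schemas above are composed with `allOf` (e.g. `CycleBase` → `CycleCreate` → `CycleRead`), so a client or validator has to merge the `required` lists through the `$ref` chain. A small self-contained sketch of that resolution, with the relevant spec fragment inlined as a dict for brevity:

```python
# Resolve the `required` fields a schema inherits through allOf/$ref chains,
# mirroring the CycleBase -> CycleCreate -> CycleRead composition above.
SCHEMAS = {
    "CycleBase": {
        "type": "object",
        "properties": {"friendly_name": {"type": "string", "nullable": True}},
    },
    "CycleCreate": {
        "allOf": [
            {"$ref": "#/components/schemas/CycleBase"},
            {"type": "object",
             "required": ["user_id", "symbol", "exchange_id",
                          "underlying_currency", "start_date"]},
        ]
    },
    "CycleRead": {
        "allOf": [
            {"$ref": "#/components/schemas/CycleCreate"},
            {"type": "object", "required": ["id"]},
        ]
    },
}

def required_fields(name: str) -> set[str]:
    """Union of this schema's `required` list and those of all allOf parents."""
    schema = SCHEMAS[name]
    fields = set(schema.get("required", []))
    for part in schema.get("allOf", []):
        if "$ref" in part:
            # "#/components/schemas/X" -> "X"
            fields |= required_fields(part["$ref"].rsplit("/", 1)[-1])
        else:
            fields |= set(part.get("required", []))
    return fields

print(sorted(required_fields("CycleRead")))
# ['exchange_id', 'id', 'start_date', 'symbol', 'underlying_currency', 'user_id']
```

This is only a sketch of the merge rule; a real client would resolve refs against the full parsed document rather than a hand-copied dict.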

backend/requirements.in Normal file

@@ -0,0 +1,7 @@
fastapi
uvicorn
httpx
pyyaml
pydantic-settings
sqlmodel
argon2-cffi


@@ -1,5 +1,491 @@
fastapi
uvicorn
httpx
pyyaml
pydantic-settings
#
# This file is autogenerated by pip-compile with Python 3.11
# by the following command:
#
# pip-compile --generate-hashes requirements.in
#
annotated-types==0.7.0 \
--hash=sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53 \
--hash=sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89
# via pydantic
anyio==4.10.0 \
--hash=sha256:3f3fae35c96039744587aa5b8371e7e8e603c0702999535961dd336026973ba6 \
--hash=sha256:60e474ac86736bbfd6f210f7a61218939c318f43f9972497381f1c5e930ed3d1
# via
# httpx
# starlette
argon2-cffi==25.1.0 \
--hash=sha256:694ae5cc8a42f4c4e2bf2ca0e64e51e23a040c6a517a85074683d3959e1346c1 \
--hash=sha256:fdc8b074db390fccb6eb4a3604ae7231f219aa669a2652e0f20e16ba513d5741
# via -r requirements.in
argon2-cffi-bindings==25.1.0 \
--hash=sha256:1db89609c06afa1a214a69a462ea741cf735b29a57530478c06eb81dd403de99 \
--hash=sha256:1e021e87faa76ae0d413b619fe2b65ab9a037f24c60a1e6cc43457ae20de6dc6 \
--hash=sha256:21378b40e1b8d1655dd5310c84a40fc19a9aa5e6366e835ceb8576bf0fea716d \
--hash=sha256:2630b6240b495dfab90aebe159ff784d08ea999aa4b0d17efa734055a07d2f44 \
--hash=sha256:3c6702abc36bf3ccba3f802b799505def420a1b7039862014a65db3205967f5a \
--hash=sha256:3d3f05610594151994ca9ccb3c771115bdb4daef161976a266f0dd8aa9996b8f \
--hash=sha256:473bcb5f82924b1becbb637b63303ec8d10e84c8d241119419897a26116515d2 \
--hash=sha256:5acb4e41090d53f17ca1110c3427f0a130f944b896fc8c83973219c97f57b690 \
--hash=sha256:5d588dec224e2a83edbdc785a5e6f3c6cd736f46bfd4b441bbb5aa1f5085e584 \
--hash=sha256:6dca33a9859abf613e22733131fc9194091c1fa7cb3e131c143056b4856aa47e \
--hash=sha256:7aef0c91e2c0fbca6fc68e7555aa60ef7008a739cbe045541e438373bc54d2b0 \
--hash=sha256:84a461d4d84ae1295871329b346a97f68eade8c53b6ed9a7ca2d7467f3c8ff6f \
--hash=sha256:87c33a52407e4c41f3b70a9c2d3f6056d88b10dad7695be708c5021673f55623 \
--hash=sha256:8b8efee945193e667a396cbc7b4fb7d357297d6234d30a489905d96caabde56b \
--hash=sha256:a1c70058c6ab1e352304ac7e3b52554daadacd8d453c1752e547c76e9c99ac44 \
--hash=sha256:a98cd7d17e9f7ce244c0803cad3c23a7d379c301ba618a5fa76a67d116618b98 \
--hash=sha256:aecba1723ae35330a008418a91ea6cfcedf6d31e5fbaa056a166462ff066d500 \
--hash=sha256:b0fdbcf513833809c882823f98dc2f931cf659d9a1429616ac3adebb49f5db94 \
--hash=sha256:b55aec3565b65f56455eebc9b9f34130440404f27fe21c3b375bf1ea4d8fbae6 \
--hash=sha256:b957f3e6ea4d55d820e40ff76f450952807013d361a65d7f28acc0acbf29229d \
--hash=sha256:ba92837e4a9aa6a508c8d2d7883ed5a8f6c308c89a4790e1e447a220deb79a85 \
--hash=sha256:c4f9665de60b1b0e99bcd6be4f17d90339698ce954cfd8d9cf4f91c995165a92 \
--hash=sha256:c87b72589133f0346a1cb8d5ecca4b933e3c9b64656c9d175270a000e73b288d \
--hash=sha256:d3e924cfc503018a714f94a49a149fdc0b644eaead5d1f089330399134fa028a \
--hash=sha256:da0c79c23a63723aa5d782250fbf51b768abca630285262fb5144ba5ae01e520 \
--hash=sha256:e2fd3bfbff3c5d74fef31a722f729bf93500910db650c925c2d6ef879a7e51cb
# via argon2-cffi
certifi==2025.8.3 \
--hash=sha256:e564105f78ded564e3ae7c923924435e1daa7463faeab5bb932bc53ffae63407 \
--hash=sha256:f6c12493cfb1b06ba2ff328595af9350c65d6644968e5d3a2ffd78699af217a5
# via
# httpcore
# httpx
cffi==2.0.0 \
--hash=sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb \
--hash=sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b \
--hash=sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f \
--hash=sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9 \
--hash=sha256:0cf2d91ecc3fcc0625c2c530fe004f82c110405f101548512cce44322fa8ac44 \
--hash=sha256:0f6084a0ea23d05d20c3edcda20c3d006f9b6f3fefeac38f59262e10cef47ee2 \
--hash=sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c \
--hash=sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75 \
--hash=sha256:1cd13c99ce269b3ed80b417dcd591415d3372bcac067009b6e0f59c7d4015e65 \
--hash=sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e \
--hash=sha256:1f72fb8906754ac8a2cc3f9f5aaa298070652a0ffae577e0ea9bd480dc3c931a \
--hash=sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e \
--hash=sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25 \
--hash=sha256:2081580ebb843f759b9f617314a24ed5738c51d2aee65d31e02f6f7a2b97707a \
--hash=sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe \
--hash=sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b \
--hash=sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91 \
--hash=sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592 \
--hash=sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187 \
--hash=sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c \
--hash=sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1 \
--hash=sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94 \
--hash=sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba \
--hash=sha256:3e837e369566884707ddaf85fc1744b47575005c0a229de3327f8f9a20f4efeb \
--hash=sha256:3f4d46d8b35698056ec29bca21546e1551a205058ae1a181d871e278b0b28165 \
--hash=sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529 \
--hash=sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca \
--hash=sha256:4647afc2f90d1ddd33441e5b0e85b16b12ddec4fca55f0d9671fef036ecca27c \
--hash=sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6 \
--hash=sha256:53f77cbe57044e88bbd5ed26ac1d0514d2acf0591dd6bb02a3ae37f76811b80c \
--hash=sha256:5eda85d6d1879e692d546a078b44251cdd08dd1cfb98dfb77b670c97cee49ea0 \
--hash=sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743 \
--hash=sha256:61d028e90346df14fedc3d1e5441df818d095f3b87d286825dfcbd6459b7ef63 \
--hash=sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5 \
--hash=sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5 \
--hash=sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4 \
--hash=sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d \
--hash=sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b \
--hash=sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93 \
--hash=sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205 \
--hash=sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27 \
--hash=sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512 \
--hash=sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d \
--hash=sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c \
--hash=sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037 \
--hash=sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26 \
--hash=sha256:89472c9762729b5ae1ad974b777416bfda4ac5642423fa93bd57a09204712322 \
--hash=sha256:8ea985900c5c95ce9db1745f7933eeef5d314f0565b27625d9a10ec9881e1bfb \
--hash=sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c \
--hash=sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8 \
--hash=sha256:9332088d75dc3241c702d852d4671613136d90fa6881da7d770a483fd05248b4 \
--hash=sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414 \
--hash=sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9 \
--hash=sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664 \
--hash=sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9 \
--hash=sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775 \
--hash=sha256:b18a3ed7d5b3bd8d9ef7a8cb226502c6bf8308df1525e1cc676c3680e7176739 \
--hash=sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc \
--hash=sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062 \
--hash=sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe \
--hash=sha256:b882b3df248017dba09d6b16defe9b5c407fe32fc7c65a9c69798e6175601be9 \
--hash=sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92 \
--hash=sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5 \
--hash=sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13 \
--hash=sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d \
--hash=sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26 \
--hash=sha256:cb527a79772e5ef98fb1d700678fe031e353e765d1ca2d409c92263c6d43e09f \
--hash=sha256:cf364028c016c03078a23b503f02058f1814320a56ad535686f90565636a9495 \
--hash=sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b \
--hash=sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6 \
--hash=sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c \
--hash=sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef \
--hash=sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5 \
--hash=sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18 \
--hash=sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad \
--hash=sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3 \
--hash=sha256:de8dad4425a6ca6e4e5e297b27b5c824ecc7581910bf9aee86cb6835e6812aa7 \
--hash=sha256:e11e82b744887154b182fd3e7e8512418446501191994dbf9c9fc1f32cc8efd5 \
--hash=sha256:e6e73b9e02893c764e7e8d5bb5ce277f1a009cd5243f8228f75f842bf937c534 \
--hash=sha256:f73b96c41e3b2adedc34a7356e64c8eb96e03a3782b535e043a986276ce12a49 \
--hash=sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2 \
--hash=sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5 \
--hash=sha256:fc7de24befaeae77ba923797c7c87834c73648a05a4bde34b3b7e5588973a453 \
--hash=sha256:fe562eb1a64e67dd297ccc4f5addea2501664954f2692b69a76449ec7913ecbf
# via argon2-cffi-bindings
click==8.2.1 \
--hash=sha256:27c491cc05d968d271d5a1db13e3b5a184636d9d930f148c50b038f0d0646202 \
--hash=sha256:61a3265b914e850b85317d0b3109c7f8cd35a670f963866005d6ef1d5175a12b
# via uvicorn
fastapi==0.116.1 \
--hash=sha256:c46ac7c312df840f0c9e220f7964bada936781bc4e2e6eb71f1c4d7553786565 \
--hash=sha256:ed52cbf946abfd70c5a0dccb24673f0670deeb517a88b3544d03c2a6bf283143
# via -r requirements.in
greenlet==3.2.4 \
--hash=sha256:00fadb3fedccc447f517ee0d3fd8fe49eae949e1cd0f6a611818f4f6fb7dc83b \
--hash=sha256:061dc4cf2c34852b052a8620d40f36324554bc192be474b9e9770e8c042fd735 \
--hash=sha256:0db5594dce18db94f7d1650d7489909b57afde4c580806b8d9203b6e79cdc079 \
--hash=sha256:0dca0d95ff849f9a364385f36ab49f50065d76964944638be9691e1832e9f86d \
--hash=sha256:16458c245a38991aa19676900d48bd1a6f2ce3e16595051a4db9d012154e8433 \
--hash=sha256:18d9260df2b5fbf41ae5139e1be4e796d99655f023a636cd0e11e6406cca7d58 \
--hash=sha256:1987de92fec508535687fb807a5cea1560f6196285a4cde35c100b8cd632cc52 \
--hash=sha256:1a921e542453fe531144e91e1feedf12e07351b1cf6c9e8a3325ea600a715a31 \
--hash=sha256:1ee8fae0519a337f2329cb78bd7a8e128ec0f881073d43f023c7b8d4831d5246 \
--hash=sha256:20fb936b4652b6e307b8f347665e2c615540d4b42b3b4c8a321d8286da7e520f \
--hash=sha256:23768528f2911bcd7e475210822ffb5254ed10d71f4028387e5a99b4c6699671 \
--hash=sha256:2523e5246274f54fdadbce8494458a2ebdcdbc7b802318466ac5606d3cded1f8 \
--hash=sha256:27890167f55d2387576d1f41d9487ef171849ea0359ce1510ca6e06c8bece11d \
--hash=sha256:299fd615cd8fc86267b47597123e3f43ad79c9d8a22bebdce535e53550763e2f \
--hash=sha256:3b3812d8d0c9579967815af437d96623f45c0f2ae5f04e366de62a12d83a8fb0 \
--hash=sha256:3b67ca49f54cede0186854a008109d6ee71f66bd57bb36abd6d0a0267b540cdd \
--hash=sha256:44358b9bf66c8576a9f57a590d5f5d6e72fa4228b763d0e43fee6d3b06d3a337 \
--hash=sha256:49a30d5fda2507ae77be16479bdb62a660fa51b1eb4928b524975b3bde77b3c0 \
--hash=sha256:4d1378601b85e2e5171b99be8d2dc85f594c79967599328f95c1dc1a40f1c633 \
--hash=sha256:554b03b6e73aaabec3745364d6239e9e012d64c68ccd0b8430c64ccc14939a8b \
--hash=sha256:55e9c5affaa6775e2c6b67659f3a71684de4c549b3dd9afca3bc773533d284fa \
--hash=sha256:58b97143c9cc7b86fc458f215bd0932f1757ce649e05b640fea2e79b54cedb31 \
--hash=sha256:5c9320971821a7cb77cfab8d956fa8e39cd07ca44b6070db358ceb7f8797c8c9 \
--hash=sha256:65458b409c1ed459ea899e939f0e1cdb14f58dbc803f2f93c5eab5694d32671b \
--hash=sha256:671df96c1f23c4a0d4077a325483c1503c96a1b7d9db26592ae770daa41233d4 \
--hash=sha256:710638eb93b1fa52823aa91bf75326f9ecdfd5e0466f00789246a5280f4ba0fc \
--hash=sha256:73f49b5368b5359d04e18d15828eecc1806033db5233397748f4ca813ff1056c \
--hash=sha256:81701fd84f26330f0d5f4944d4e92e61afe6319dcd9775e39396e39d7c3e5f98 \
--hash=sha256:8854167e06950ca75b898b104b63cc646573aa5fef1353d4508ecdd1ee76254f \
--hash=sha256:8c68325b0d0acf8d91dde4e6f930967dd52a5302cd4062932a6b2e7c2969f47c \
--hash=sha256:94385f101946790ae13da500603491f04a76b6e4c059dab271b3ce2e283b2590 \
--hash=sha256:94abf90142c2a18151632371140b3dba4dee031633fe614cb592dbb6c9e17bc3 \
--hash=sha256:96378df1de302bc38e99c3a9aa311967b7dc80ced1dcc6f171e99842987882a2 \
--hash=sha256:9c40adce87eaa9ddb593ccb0fa6a07caf34015a29bf8d344811665b573138db9 \
--hash=sha256:9fe0a28a7b952a21e2c062cd5756d34354117796c6d9215a87f55e38d15402c5 \
--hash=sha256:a7d4e128405eea3814a12cc2605e0e6aedb4035bf32697f72deca74de4105e02 \
--hash=sha256:abbf57b5a870d30c4675928c37278493044d7c14378350b3aa5d484fa65575f0 \
--hash=sha256:b4a1870c51720687af7fa3e7cda6d08d801dae660f75a76f3845b642b4da6ee1 \
--hash=sha256:b6a7c19cf0d2742d0809a4c05975db036fdff50cd294a93632d6a310bf9ac02c \
--hash=sha256:b90654e092f928f110e0007f572007c9727b5265f7632c2fa7415b4689351594 \
--hash=sha256:c17b6b34111ea72fc5a4e4beec9711d2226285f0386ea83477cbb97c30a3f3a5 \
--hash=sha256:c2ca18a03a8cfb5b25bc1cbe20f3d9a4c80d8c3b13ba3df49ac3961af0b1018d \
--hash=sha256:c5111ccdc9c88f423426df3fd1811bfc40ed66264d35aa373420a34377efc98a \
--hash=sha256:c60a6d84229b271d44b70fb6e5fa23781abb5d742af7b808ae3f6efd7c9c60f6 \
--hash=sha256:c8c9e331e58180d0d83c5b7999255721b725913ff6bc6cf39fa2a45841a4fd4b \
--hash=sha256:c9913f1a30e4526f432991f89ae263459b1c64d1608c0d22a5c79c287b3c70df \
--hash=sha256:cd3c8e693bff0fff6ba55f140bf390fa92c994083f838fece0f63be121334945 \
--hash=sha256:d25c5091190f2dc0eaa3f950252122edbbadbb682aa7b1ef2f8af0f8c0afefae \
--hash=sha256:d2e685ade4dafd447ede19c31277a224a239a0a1a4eca4e6390efedf20260cfb \
--hash=sha256:d76383238584e9711e20ebe14db6c88ddcedc1829a9ad31a584389463b5aa504 \
--hash=sha256:ddf9164e7a5b08e9d22511526865780a576f19ddd00d62f8a665949327fde8bb \
--hash=sha256:e37ab26028f12dbb0ff65f29a8d3d44a765c61e729647bf2ddfbbed621726f01 \
--hash=sha256:f10fd42b5ee276335863712fa3da6608e93f70629c631bf77145021600abc23c \
--hash=sha256:f28588772bb5fb869a8eb331374ec06f24a83a9c25bfa1f38b6993afe9c1e968
# via sqlalchemy
h11==0.16.0 \
--hash=sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1 \
--hash=sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86
# via
# httpcore
# uvicorn
httpcore==1.0.9 \
--hash=sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55 \
--hash=sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8
# via httpx
httpx==0.28.1 \
--hash=sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc \
--hash=sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad
# via -r requirements.in
idna==3.10 \
--hash=sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9 \
--hash=sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3
# via
# anyio
# httpx
pycparser==2.23 \
--hash=sha256:78816d4f24add8f10a06d6f05b4d424ad9e96cfebf68a4ddc99c65c0720d00c2 \
--hash=sha256:e5c6e8d3fbad53479cab09ac03729e0a9faf2bee3db8208a550daf5af81a5934
# via cffi
pydantic==2.11.7 \
--hash=sha256:d989c3c6cb79469287b1569f7447a17848c998458d49ebe294e975b9baf0f0db \
--hash=sha256:dde5df002701f6de26248661f6835bbe296a47bf73990135c7d07ce741b9623b
# via
# fastapi
# pydantic-settings
# sqlmodel
pydantic-core==2.33.2 \
--hash=sha256:0069c9acc3f3981b9ff4cdfaf088e98d83440a4c7ea1bc07460af3d4dc22e72d \
--hash=sha256:031c57d67ca86902726e0fae2214ce6770bbe2f710dc33063187a68744a5ecac \
--hash=sha256:0405262705a123b7ce9f0b92f123334d67b70fd1f20a9372b907ce1080c7ba02 \
--hash=sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56 \
--hash=sha256:09fb9dd6571aacd023fe6aaca316bd01cf60ab27240d7eb39ebd66a3a15293b4 \
--hash=sha256:0a39979dcbb70998b0e505fb1556a1d550a0781463ce84ebf915ba293ccb7e22 \
--hash=sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef \
--hash=sha256:0e03262ab796d986f978f79c943fc5f620381be7287148b8010b4097f79a39ec \
--hash=sha256:0e5b2671f05ba48b94cb90ce55d8bdcaaedb8ba00cc5359f6810fc918713983d \
--hash=sha256:0e6116757f7959a712db11f3e9c0a99ade00a5bbedae83cb801985aa154f071b \
--hash=sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a \
--hash=sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f \
--hash=sha256:1a8695a8d00c73e50bff9dfda4d540b7dee29ff9b8053e38380426a85ef10052 \
--hash=sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab \
--hash=sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916 \
--hash=sha256:2058a32994f1fde4ca0480ab9d1e75a0e8c87c22b53a3ae66554f9af78f2fe8c \
--hash=sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf \
--hash=sha256:2807668ba86cb38c6817ad9bc66215ab8584d1d304030ce4f0887336f28a5e27 \
--hash=sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a \
--hash=sha256:2b3d326aaef0c0399d9afffeb6367d5e26ddc24d351dbc9c636840ac355dc5d8 \
--hash=sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7 \
--hash=sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612 \
--hash=sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1 \
--hash=sha256:3a1c81334778f9e3af2f8aeb7a960736e5cab1dfebfb26aabca09afd2906c039 \
--hash=sha256:3abcd9392a36025e3bd55f9bd38d908bd17962cc49bc6da8e7e96285336e2bca \
--hash=sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7 \
--hash=sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a \
--hash=sha256:3eb3fe62804e8f859c49ed20a8451342de53ed764150cb14ca71357c765dc2a6 \
--hash=sha256:44857c3227d3fb5e753d5fe4a3420d6376fa594b07b621e220cd93703fe21782 \
--hash=sha256:4b25d91e288e2c4e0662b8038a28c6a07eaac3e196cfc4ff69de4ea3db992a1b \
--hash=sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7 \
--hash=sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025 \
--hash=sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849 \
--hash=sha256:53a57d2ed685940a504248187d5685e49eb5eef0f696853647bf37c418c538f7 \
--hash=sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b \
--hash=sha256:5c4aa4e82353f65e548c476b37e64189783aa5384903bfea4f41580f255fddfa \
--hash=sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e \
--hash=sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea \
--hash=sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac \
--hash=sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51 \
--hash=sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e \
--hash=sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162 \
--hash=sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65 \
--hash=sha256:6bdfe4b3789761f3bcb4b1ddf33355a71079858958e3a552f16d5af19768fef2 \
--hash=sha256:6fa6dfc3e4d1f734a34710f391ae822e0a8eb8559a85c6979e14e65ee6ba2954 \
--hash=sha256:73662edf539e72a9440129f231ed3757faab89630d291b784ca99237fb94db2b \
--hash=sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de \
--hash=sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc \
--hash=sha256:7f92c15cd1e97d4b12acd1cc9004fa092578acfa57b67ad5e43a197175d01a64 \
--hash=sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb \
--hash=sha256:83aa99b1285bc8f038941ddf598501a86f1536789740991d7d8756e34f1e74d9 \
--hash=sha256:87acbfcf8e90ca885206e98359d7dca4bcbb35abdc0ff66672a293e1d7a19101 \
--hash=sha256:87b31b6846e361ef83fedb187bb5b4372d0da3f7e28d85415efa92d6125d6e6d \
--hash=sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef \
--hash=sha256:8d55ab81c57b8ff8548c3e4947f119551253f4e3787a7bbc0b6b3ca47498a9d3 \
--hash=sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1 \
--hash=sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5 \
--hash=sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88 \
--hash=sha256:970919794d126ba8645f3837ab6046fb4e72bbc057b3709144066204c19a455d \
--hash=sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290 \
--hash=sha256:9fcd347d2cc5c23b06de6d3b7b8275be558a0c90549495c699e379a80bf8379e \
--hash=sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d \
--hash=sha256:a11c8d26a50bfab49002947d3d237abe4d9e4b5bdc8846a63537b6488e197808 \
--hash=sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc \
--hash=sha256:a2b911a5b90e0374d03813674bf0a5fbbb7741570dcd4b4e85a2e48d17def29d \
--hash=sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc \
--hash=sha256:aa9d91b338f2df0508606f7009fde642391425189bba6d8c653afd80fd6bb64e \
--hash=sha256:b0379a2b24882fef529ec3b4987cb5d003b9cda32256024e6fe1586ac45fc640 \
--hash=sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30 \
--hash=sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e \
--hash=sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9 \
--hash=sha256:c20c462aa4434b33a2661701b861604913f912254e441ab8d78d30485736115a \
--hash=sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9 \
--hash=sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f \
--hash=sha256:c54c939ee22dc8e2d545da79fc5381f1c020d6d3141d3bd747eab59164dc89fb \
--hash=sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5 \
--hash=sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab \
--hash=sha256:d3f26877a748dc4251cfcfda9dfb5f13fcb034f5308388066bcfe9031b63ae7d \
--hash=sha256:d53b22f2032c42eaaf025f7c40c2e3b94568ae077a606f006d206a463bc69572 \
--hash=sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593 \
--hash=sha256:d946c8bf0d5c24bf4fe333af284c59a19358aa3ec18cb3dc4370080da1e8ad29 \
--hash=sha256:dac89aea9af8cd672fa7b510e7b8c33b0bba9a43186680550ccf23020f32d535 \
--hash=sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1 \
--hash=sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f \
--hash=sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8 \
--hash=sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf \
--hash=sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246 \
--hash=sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9 \
--hash=sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011 \
--hash=sha256:eb9b459ca4df0e5c87deb59d37377461a538852765293f9e6ee834f0435a93b9 \
--hash=sha256:efec8db3266b76ef9607c2c4c419bdb06bf335ae433b80816089ea7585816f6a \
--hash=sha256:f481959862f57f29601ccced557cc2e817bce7533ab8e01a797a48b49c9692b3 \
--hash=sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6 \
--hash=sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8 \
--hash=sha256:f8de619080e944347f5f20de29a975c2d815d9ddd8be9b9b7268e2e3ef68605a \
--hash=sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2 \
--hash=sha256:fa754d1850735a0b0e03bcffd9d4b4343eb417e47196e4485d9cca326073a42c \
--hash=sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6 \
--hash=sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d
# via pydantic
pydantic-settings==2.10.1 \
--hash=sha256:06f0062169818d0f5524420a360d632d5857b83cffd4d42fe29597807a1614ee \
--hash=sha256:a60952460b99cf661dc25c29c0ef171721f98bfcb52ef8d9ea4c943d7c8cc796
# via -r requirements.in
python-dotenv==1.1.1 \
--hash=sha256:31f23644fe2602f88ff55e1f5c79ba497e01224ee7737937930c448e4d0e24dc \
--hash=sha256:a8a6399716257f45be6a007360200409fce5cda2661e3dec71d23dc15f6189ab
# via pydantic-settings
pyyaml==6.0.2 \
--hash=sha256:01179a4a8559ab5de078078f37e5c1a30d76bb88519906844fd7bdea1b7729ff \
--hash=sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48 \
--hash=sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086 \
--hash=sha256:0b69e4ce7a131fe56b7e4d770c67429700908fc0752af059838b1cfb41960e4e \
--hash=sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133 \
--hash=sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5 \
--hash=sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484 \
--hash=sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee \
--hash=sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5 \
--hash=sha256:23502f431948090f597378482b4812b0caae32c22213aecf3b55325e049a6c68 \
--hash=sha256:24471b829b3bf607e04e88d79542a9d48bb037c2267d7927a874e6c205ca7e9a \
--hash=sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf \
--hash=sha256:2e99c6826ffa974fe6e27cdb5ed0021786b03fc98e5ee3c5bfe1fd5015f42b99 \
--hash=sha256:39693e1f8320ae4f43943590b49779ffb98acb81f788220ea932a6b6c51004d8 \
--hash=sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85 \
--hash=sha256:3b1fdb9dc17f5a7677423d508ab4f243a726dea51fa5e70992e59a7411c89d19 \
--hash=sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc \
--hash=sha256:43fa96a3ca0d6b1812e01ced1044a003533c47f6ee8aca31724f78e93ccc089a \
--hash=sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1 \
--hash=sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317 \
--hash=sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c \
--hash=sha256:6395c297d42274772abc367baaa79683958044e5d3835486c16da75d2a694631 \
--hash=sha256:688ba32a1cffef67fd2e9398a2efebaea461578b0923624778664cc1c914db5d \
--hash=sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652 \
--hash=sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5 \
--hash=sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e \
--hash=sha256:7c36280e6fb8385e520936c3cb3b8042851904eba0e58d277dca80a5cfed590b \
--hash=sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8 \
--hash=sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476 \
--hash=sha256:82d09873e40955485746739bcb8b4586983670466c23382c19cffecbf1fd8706 \
--hash=sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563 \
--hash=sha256:8824b5a04a04a047e72eea5cec3bc266db09e35de6bdfe34c9436ac5ee27d237 \
--hash=sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b \
--hash=sha256:9056c1ecd25795207ad294bcf39f2db3d845767be0ea6e6a34d856f006006083 \
--hash=sha256:936d68689298c36b53b29f23c6dbb74de12b4ac12ca6cfe0e047bedceea56180 \
--hash=sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425 \
--hash=sha256:a4d3091415f010369ae4ed1fc6b79def9416358877534caf6a0fdd2146c87a3e \
--hash=sha256:a8786accb172bd8afb8be14490a16625cbc387036876ab6ba70912730faf8e1f \
--hash=sha256:a9f8c2e67970f13b16084e04f134610fd1d374bf477b17ec1599185cf611d725 \
--hash=sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183 \
--hash=sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab \
--hash=sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774 \
--hash=sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725 \
--hash=sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e \
--hash=sha256:d7fded462629cfa4b685c5416b949ebad6cec74af5e2d42905d41e257e0869f5 \
--hash=sha256:d84a1718ee396f54f3a086ea0a66d8e552b2ab2017ef8b420e92edbc841c352d \
--hash=sha256:d8e03406cac8513435335dbab54c0d385e4a49e4945d2909a581c83647ca0290 \
--hash=sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44 \
--hash=sha256:ec031d5d2feb36d1d1a24380e4db6d43695f3748343d99434e6f5f9156aaa2ed \
--hash=sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4 \
--hash=sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba \
--hash=sha256:f753120cb8181e736c57ef7636e83f31b9c0d1722c516f7e86cf15b7aa57ff12 \
--hash=sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4
# via -r requirements.in
sniffio==1.3.1 \
--hash=sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2 \
--hash=sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc
# via anyio
sqlalchemy==2.0.43 \
--hash=sha256:022e436a1cb39b13756cf93b48ecce7aa95382b9cfacceb80a7d263129dfd019 \
--hash=sha256:03d73ab2a37d9e40dec4984d1813d7878e01dbdc742448d44a7341b7a9f408c7 \
--hash=sha256:07097c0a1886c150ef2adba2ff7437e84d40c0f7dcb44a2c2b9c905ccfc6361c \
--hash=sha256:11b9503fa6f8721bef9b8567730f664c5a5153d25e247aadc69247c4bc605227 \
--hash=sha256:11f43c39b4b2ec755573952bbcc58d976779d482f6f832d7f33a8d869ae891bf \
--hash=sha256:13194276e69bb2af56198fef7909d48fd34820de01d9c92711a5fa45497cc7ed \
--hash=sha256:136063a68644eca9339d02e6693932116f6a8591ac013b0014479a1de664e40a \
--hash=sha256:14111d22c29efad445cd5021a70a8b42f7d9152d8ba7f73304c4d82460946aaa \
--hash=sha256:1681c21dd2ccee222c2fe0bef671d1aef7c504087c9c4e800371cfcc8ac966fc \
--hash=sha256:1a113da919c25f7f641ffbd07fbc9077abd4b3b75097c888ab818f962707eb48 \
--hash=sha256:1c6d85327ca688dbae7e2b06d7d84cfe4f3fffa5b5f9e21bb6ce9d0e1a0e0e0a \
--hash=sha256:20d81fc2736509d7a2bd33292e489b056cbae543661bb7de7ce9f1c0cd6e7f24 \
--hash=sha256:21b27b56eb2f82653168cefe6cb8e970cdaf4f3a6cb2c5e3c3c1cf3158968ff9 \
--hash=sha256:21ba7a08a4253c5825d1db389d4299f64a100ef9800e4624c8bf70d8f136e6ed \
--hash=sha256:227119ce0a89e762ecd882dc661e0aa677a690c914e358f0dd8932a2e8b2765b \
--hash=sha256:25b9fc27650ff5a2c9d490c13c14906b918b0de1f8fcbb4c992712d8caf40e83 \
--hash=sha256:334f41fa28de9f9be4b78445e68530da3c5fa054c907176460c81494f4ae1f5e \
--hash=sha256:413391b2239db55be14fa4223034d7e13325a1812c8396ecd4f2c08696d5ccad \
--hash=sha256:4286a1139f14b7d70141c67a8ae1582fc2b69105f1b09d9573494eb4bb4b2687 \
--hash=sha256:44337823462291f17f994d64282a71c51d738fc9ef561bf265f1d0fd9116a782 \
--hash=sha256:46293c39252f93ea0910aababa8752ad628bcce3a10d3f260648dd472256983f \
--hash=sha256:4bf0edb24c128b7be0c61cd17eef432e4bef507013292415f3fb7023f02b7d4b \
--hash=sha256:4d3d9b904ad4a6b175a2de0738248822f5ac410f52c2fd389ada0b5262d6a1e3 \
--hash=sha256:4e6aeb2e0932f32950cf56a8b4813cb15ff792fc0c9b3752eaf067cfe298496a \
--hash=sha256:4fb1a8c5438e0c5ea51afe9c6564f951525795cf432bed0c028c1cb081276685 \
--hash=sha256:529064085be2f4d8a6e5fab12d36ad44f1909a18848fcfbdb59cc6d4bbe48efe \
--hash=sha256:52d9b73b8fb3e9da34c2b31e6d99d60f5f99fd8c1225c9dad24aeb74a91e1d29 \
--hash=sha256:5cda6b51faff2639296e276591808c1726c4a77929cfaa0f514f30a5f6156921 \
--hash=sha256:5d79f9fdc9584ec83d1b3c75e9f4595c49017f5594fee1a2217117647225d738 \
--hash=sha256:61f964a05356f4bca4112e6334ed7c208174511bd56e6b8fc86dad4d024d4185 \
--hash=sha256:6772e3ca8a43a65a37c88e2f3e2adfd511b0b1da37ef11ed78dea16aeae85bd9 \
--hash=sha256:6e2bf13d9256398d037fef09fd8bf9b0bf77876e22647d10761d35593b9ac547 \
--hash=sha256:70322986c0c699dca241418fcf18e637a4369e0ec50540a2b907b184c8bca069 \
--hash=sha256:788bfcef6787a7764169cfe9859fe425bf44559619e1d9f56f5bddf2ebf6f417 \
--hash=sha256:7f1ac7828857fcedb0361b48b9ac4821469f7694089d15550bbcf9ab22564a1d \
--hash=sha256:87accdbba88f33efa7b592dc2e8b2a9c2cdbca73db2f9d5c510790428c09c154 \
--hash=sha256:8cee08f15d9e238ede42e9bbc1d6e7158d0ca4f176e4eab21f88ac819ae3bd7b \
--hash=sha256:971ba928fcde01869361f504fcff3b7143b47d30de188b11c6357c0505824197 \
--hash=sha256:9c2e02f06c68092b875d5cbe4824238ab93a7fa35d9c38052c033f7ca45daa18 \
--hash=sha256:9c5a9da957c56e43d72126a3f5845603da00e0293720b03bde0aacffcf2dc04f \
--hash=sha256:9df7126fd9db49e3a5a3999442cc67e9ee8971f3cb9644250107d7296cb2a164 \
--hash=sha256:b3edaec7e8b6dc5cd94523c6df4f294014df67097c8217a89929c99975811414 \
--hash=sha256:b535d35dea8bbb8195e7e2b40059e2253acb2b7579b73c1b432a35363694641d \
--hash=sha256:bcf0724a62a5670e5718957e05c56ec2d6850267ea859f8ad2481838f889b42c \
--hash=sha256:c00e7845d2f692ebfc7d5e4ec1a3fd87698e4337d09e58d6749a16aedfdf8612 \
--hash=sha256:c379e37b08c6c527181a397212346be39319fb64323741d23e46abd97a400d34 \
--hash=sha256:c5d1730b25d9a07727d20ad74bc1039bbbb0a6ca24e6769861c1aa5bf2c4c4a8 \
--hash=sha256:c5e73ba0d76eefc82ec0219d2301cb33bfe5205ed7a2602523111e2e56ccbd20 \
--hash=sha256:c697575d0e2b0a5f0433f679bda22f63873821d991e95a90e9e52aae517b2e32 \
--hash=sha256:cdeff998cb294896a34e5b2f00e383e7c5c4ef3b4bfa375d9104723f15186443 \
--hash=sha256:ceb5c832cc30663aeaf5e39657712f4c4241ad1f638d487ef7216258f6d41fe7 \
--hash=sha256:d34c0f6dbefd2e816e8f341d0df7d4763d382e3f452423e752ffd1e213da2512 \
--hash=sha256:db691fa174e8f7036afefe3061bc40ac2b770718be2862bfb03aabae09051aca \
--hash=sha256:e7a903b5b45b0d9fa03ac6a331e1c1d6b7e0ab41c63b6217b3d10357b83c8b00 \
--hash=sha256:e7c08f57f75a2bb62d7ee80a89686a5e5669f199235c6d1dac75cd59374091c3 \
--hash=sha256:f42f23e152e4545157fa367b2435a1ace7571cab016ca26038867eb7df2c3631 \
--hash=sha256:fe2b3b4927d0bc03d02ad883f402d5de201dbc8894ac87d2e981e7d87430e60d
# via sqlmodel
sqlmodel==0.0.24 \
--hash=sha256:6778852f09370908985b667d6a3ab92910d0d5ec88adcaf23dbc242715ff7193 \
--hash=sha256:cc5c7613c1a5533c9c7867e1aab2fd489a76c9e8a061984da11b4e613c182423
# via -r requirements.in
starlette==0.47.3 \
--hash=sha256:6bc94f839cc176c4858894f1f8908f0ab79dfec1a6b8402f6da9be26ebea52e9 \
--hash=sha256:89c0778ca62a76b826101e7c709e70680a1699ca7da6b44d38eb0a7e61fe4b51
# via fastapi
typing-extensions==4.15.0 \
--hash=sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466 \
--hash=sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548
# via
# anyio
# fastapi
# pydantic
# pydantic-core
# sqlalchemy
# starlette
# typing-inspection
typing-inspection==0.4.1 \
--hash=sha256:389055682238f53b04f7badcb49b989835495a96700ced5dab2d8feae4b26f51 \
--hash=sha256:6ae134cc0203c33377d43188d4064e9b357dba58cff3185f22924610e70a9d28
# via
# pydantic
# pydantic-settings
uvicorn==0.35.0 \
--hash=sha256:197535216b25ff9b785e29a0b79199f55222193d47f820816e7da751e9bc8d4a \
--hash=sha256:bc662f087f7cf2ce11a1d7fd70b90c9f98ef2e2831556dd078d131b96cc94a01
# via -r requirements.in


@@ -4,7 +4,23 @@ line-length = 144
[lint]
select = ["ALL"]
fixable = ["UP034", "I001"]
ignore = ["T201", "D", "ANN101", "TD002", "TD003"]
ignore = [
"T201",
"D",
"ANN101",
"TD002",
"TD003",
"TRY003",
"EM101",
"EM102",
"SIM108",
"C901",
"PLR0912",
"PLR0915",
"PLR0913",
"PLC0415",
]
[lint.extend-per-file-ignores]
"test*.py" = ["S101"]
"test*.py" = ["S101", "S105", "S106", "PT011", "PLR2004"]
"models*.py" = ["FA102"]


@@ -1,3 +1,5 @@
from __future__ import annotations
import os
from pathlib import Path
from typing import Any
@@ -12,6 +14,10 @@ class Settings(BaseSettings):
port: int = 8000
workers: int = 1
log_level: str = "info"
database_url: str = "sqlite:///:memory:"
api_base: str = "/api/v1"
session_expiry_seconds: int = 3600 * 24 * 7 # 7 days
hmac_key: str | None = None
model_config = ConfigDict(env_file=".env", env_file_encoding="utf-8")
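The Settings class above relies on pydantic-settings' layered resolution: a process environment variable beats a `.env` entry, which beats the coded default. A minimal stdlib sketch of that precedence (the `resolve` helper is hypothetical, not the pydantic-settings implementation):

```python
import os

def resolve(name: str, dotenv: dict, default):
    # precedence: process environment > .env file > coded default
    return os.environ.get(name.upper(), dotenv.get(name.upper(), default))

dotenv = {"PORT": "9000"}            # as if parsed from a .env file
os.environ["LOG_LEVEL"] = "debug"    # process env wins over .env and defaults

print(resolve("port", dotenv, 8000))         # '9000' (from .env)
print(resolve("log_level", dotenv, "info"))  # 'debug' (from the environment)
print(resolve("workers", dotenv, 1))         # 1 (coded default)
```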


@@ -0,0 +1,56 @@
curl --location '127.0.0.1:18881/api/v1/trades' \
--header 'Content-Type: application/json' \
--header 'Cookie: session_token=uYsEZZdH9ecQ432HQUdfab292I14suk4GuI12-cAyuw' \
--data '{
"friendly_name": "20250908-CA-PUT",
"symbol": "CA",
"exchange_id": 1,
"underlying_currency": "EUR",
"trade_type": "SELL_PUT",
"trade_strategy": "WHEEL",
"trade_date": "2025-09-08",
"quantity": 1,
"quantity_multiplier": 100,
"price_cents": 17,
"expiry_date": "2025-09-09",
"strike_price_cents": 1220,
"commission_cents": 114
}'
curl --location '127.0.0.1:18881/api/v1/trades' \
--header 'Content-Type: application/json' \
--header 'Cookie: session_token=uYsEZZdH9ecQ432HQUdfab292I14suk4GuI12-cAyuw' \
--data '{
"friendly_name": "20250920-CA-ASSIGN",
"symbol": "CA",
"exchange_id": 1,
"cycle_id": 1,
"underlying_currency": "EUR",
"trade_type": "ASSIGNMENT",
"trade_strategy": "WHEEL",
"trade_date": "2025-09-20",
"quantity": 100,
"quantity_multiplier": 1,
"price_cents": 1220,
"commission_cents": 0
}'
curl --location '127.0.0.1:18881/api/v1/trades' \
--header 'Content-Type: application/json' \
--header 'Cookie: session_token=uYsEZZdH9ecQ432HQUdfab292I14suk4GuI12-cAyuw' \
--data '{
"friendly_name": "20250923-CA-CALL",
"symbol": "CA",
"exchange_id": 1,
"cycle_id": 1,
"underlying_currency": "EUR",
"trade_type": "SELL_CALL",
"trade_strategy": "WHEEL",
"trade_date": "2025-09-23",
"quantity": 1,
"quantity_multiplier": 100,
"price_cents": 31,
"expiry_date": "2025-10-10",
"strike_price_cents": 1200,
"commission_cents": 114
}'
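The first curl call above can be reproduced with only the standard library; the URL and session cookie are the example values from the curl commands, and nothing is sent until `urlopen` is actually called:

```python
import json
import urllib.request

payload = {
    "friendly_name": "20250908-CA-PUT",
    "symbol": "CA",
    "exchange_id": 1,
    "underlying_currency": "EUR",
    "trade_type": "SELL_PUT",
    "trade_strategy": "WHEEL",
    "trade_date": "2025-09-08",
    "quantity": 1,
    "quantity_multiplier": 100,
    "price_cents": 17,
    "expiry_date": "2025-09-09",
    "strike_price_cents": 1220,
    "commission_cents": 114,
}
req = urllib.request.Request(
    "http://127.0.0.1:18881/api/v1/trades",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Cookie": "session_token=uYsEZZdH9ecQ432HQUdfab292I14suk4GuI12-cAyuw",
    },
    method="POST",
)
# urllib.request.urlopen(req) would perform the POST against the running API.
```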

backend/tests/test_app.py Normal file

@@ -0,0 +1,405 @@
from collections.abc import Callable
from datetime import datetime, timedelta, timezone
from types import SimpleNamespace
from unittest.mock import MagicMock
import pytest
from fastapi import FastAPI, status
from fastapi.responses import JSONResponse
from fastapi.testclient import TestClient
import settings
import trading_journal.service as svc
@pytest.fixture
def client_factory(monkeypatch: pytest.MonkeyPatch) -> Callable[..., TestClient]:
class NoAuth:
def __init__(self, app: FastAPI, **opts) -> None: # noqa: ANN003, ARG002
self.app = app
async def __call__(self, scope, receive, send) -> None: # noqa: ANN001
state = scope.get("state")
if state is None:
scope["state"] = SimpleNamespace()
scope["state"]["user_id"] = 1
await self.app(scope, receive, send)
class DeclineAuth:
def __init__(self, app: FastAPI, **opts) -> None: # noqa: ANN003, ARG002
self.app = app
async def __call__(self, scope, receive, send) -> None: # noqa: ANN001
if scope.get("type") != "http":
await self.app(scope, receive, send)
return
path = scope.get("path", "")
# allow public/exempt paths through
if getattr(svc, "EXCEPT_PATHS", []) and path in svc.EXCEPT_PATHS:
await self.app(scope, receive, send)
return
# immediately respond 401 for protected paths
resp = JSONResponse({"detail": "Unauthorized"}, status_code=status.HTTP_401_UNAUTHORIZED)
await resp(scope, receive, send)
def _factory(*, decline_auth: bool = False, **mocks: dict) -> TestClient:
defaults = {
"register_user_service": MagicMock(return_value=SimpleNamespace(model_dump=lambda: {"id": 1, "username": "mock"})),
"authenticate_user_service": MagicMock(
return_value=(SimpleNamespace(user_id=1, expires_at=(datetime.now(timezone.utc) + timedelta(hours=1))), "token"),
),
"create_exchange_service": MagicMock(
return_value=SimpleNamespace(model_dump=lambda: {"name": "Binance", "notes": "some note", "user_id": 1}),
),
"get_exchanges_by_user_service": MagicMock(return_value=[]),
}
if decline_auth:
monkeypatch.setattr(svc, "AuthMiddleWare", DeclineAuth)
else:
monkeypatch.setattr(svc, "AuthMiddleWare", NoAuth)
merged = {**defaults, **mocks}
for name, mock in merged.items():
monkeypatch.setattr(svc, name, mock)
import sys
if "app" in sys.modules:
del sys.modules["app"]
from importlib import import_module
app = import_module("app").app # re-import app module
return TestClient(app)
return _factory
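The factory's re-import idiom (drop the cached entry from sys.modules, then import_module again so the app module is re-executed with the patched service functions) can be shown in isolation; the stdlib json module stands in for the app module here:

```python
# Deleting the cached entry forces import_module to execute the module again,
# which is how the factory picks up monkeypatched globals at import time.
import sys
from importlib import import_module

m1 = import_module("json")
sys.modules.pop("json", None)   # drop the cached module object
m2 = import_module("json")      # re-executes the module body

print(m1 is m2)  # a fresh module object was created
```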
def test_get_status(client_factory: Callable[..., TestClient]) -> None:
client = client_factory()
with client as c:
response = c.get(f"{settings.settings.api_base}/status")
assert response.status_code == 200
assert response.json() == {"status": "ok"}
def test_register_success(client_factory: Callable[..., TestClient]) -> None:
client = client_factory() # use defaults
with client as c:
r = c.post(f"{settings.settings.api_base}/register", json={"username": "a", "password": "b"})
assert r.status_code == 201
def test_register_user_already_exists(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(register_user_service=MagicMock(side_effect=svc.UserAlreadyExistsError("username already exists")))
with client as c:
r = c.post(f"{settings.settings.api_base}/register", json={"username": "a", "password": "b"})
assert r.status_code == status.HTTP_400_BAD_REQUEST
assert r.json() == {"detail": "username already exists"}
def test_register_user_internal_server_error(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(register_user_service=MagicMock(side_effect=Exception("db is down")))
with client as c:
r = c.post(f"{settings.settings.api_base}/register", json={"username": "a", "password": "b"})
assert r.status_code == status.HTTP_500_INTERNAL_SERVER_ERROR
assert r.json() == {"detail": "Internal Server Error"}
def test_login_success(client_factory: Callable[..., TestClient]) -> None:
client = client_factory() # use defaults
with client as c:
r = c.post(f"{settings.settings.api_base}/login", json={"username": "a", "password": "b"})
assert r.status_code == 200
assert r.json() == {"user_id": 1}
assert r.cookies.get("session_token") == "token"
def test_login_failed_auth(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(authenticate_user_service=MagicMock(return_value=None))
with client as c:
r = c.post(f"{settings.settings.api_base}/login", json={"username": "a", "password": "b"})
assert r.status_code == status.HTTP_401_UNAUTHORIZED
assert r.json() == {"detail": "Invalid username or password, or user doesn't exist"}
def test_login_internal_server_error(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(authenticate_user_service=MagicMock(side_effect=Exception("db is down")))
with client as c:
r = c.post(f"{settings.settings.api_base}/login", json={"username": "a", "password": "b"})
assert r.status_code == status.HTTP_500_INTERNAL_SERVER_ERROR
assert r.json() == {"detail": "Internal Server Error"}
def test_create_exchange_success(client_factory: Callable[..., TestClient]) -> None:
client = client_factory()
with client as c:
r = c.post(f"{settings.settings.api_base}/exchanges", json={"name": "Binance"})
assert r.status_code == 201
assert r.json() == {"user_id": 1, "name": "Binance", "notes": "some note"}
def test_create_exchange_already_exists(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(create_exchange_service=MagicMock(side_effect=svc.ExchangeAlreadyExistsError("exchange already exists")))
with client as c:
r = c.post(f"{settings.settings.api_base}/exchanges", json={"name": "Binance"})
assert r.status_code == status.HTTP_400_BAD_REQUEST
assert r.json() == {"detail": "exchange already exists"}
def test_get_exchanges_unauthenticated(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(decline_auth=True)
with client as c:
r = c.get(f"{settings.settings.api_base}/exchanges")
assert r.status_code == status.HTTP_401_UNAUTHORIZED
assert r.json() == {"detail": "Unauthorized"}
def test_get_exchanges_success(client_factory: Callable[..., TestClient]) -> None:
client = client_factory()
with client as c:
r = c.get(f"{settings.settings.api_base}/exchanges")
assert r.status_code == 200
assert r.json() == []
def test_update_exchanges_success(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(
update_exchanges_service=MagicMock(
return_value=SimpleNamespace(model_dump=lambda: {"user_id": 1, "name": "BinanceUS", "notes": "updated note"}),
),
)
with client as c:
r = c.patch(f"{settings.settings.api_base}/exchanges/1", json={"name": "BinanceUS", "notes": "updated note"})
assert r.status_code == 200
assert r.json() == {"user_id": 1, "name": "BinanceUS", "notes": "updated note"}
def test_update_exchanges_not_found(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(update_exchanges_service=MagicMock(side_effect=svc.ExchangeNotFoundError("exchange not found")))
with client as c:
r = c.patch(f"{settings.settings.api_base}/exchanges/999", json={"name": "NonExistent", "notes": "no note"})
assert r.status_code == status.HTTP_404_NOT_FOUND
assert r.json() == {"detail": "exchange not found"}
def test_get_cycles_by_id_success(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(
get_cycle_by_id_service=MagicMock(
return_value=SimpleNamespace(
friendly_name="Cycle 1",
status="active",
id=1,
),
),
)
with client as c:
r = c.get(f"{settings.settings.api_base}/cycles/1")
assert r.status_code == 200
assert r.json() == {"id": 1, "friendly_name": "Cycle 1", "status": "active"}
def test_get_cycles_by_id_not_found(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(get_cycle_by_id_service=MagicMock(side_effect=svc.CycleNotFoundError("cycle not found")))
with client as c:
r = c.get(f"{settings.settings.api_base}/cycles/999")
assert r.status_code == status.HTTP_404_NOT_FOUND
assert r.json() == {"detail": "cycle not found"}
def test_get_cycles_by_user_success(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(
get_cycles_by_user_service=MagicMock(
return_value=[
SimpleNamespace(
friendly_name="Cycle 1",
status="active",
id=1,
),
SimpleNamespace(
friendly_name="Cycle 2",
status="completed",
id=2,
),
],
),
)
with client as c:
r = c.get(f"{settings.settings.api_base}/cycles/user/1")
assert r.status_code == 200
assert r.json() == [
{"id": 1, "friendly_name": "Cycle 1", "status": "active"},
{"id": 2, "friendly_name": "Cycle 2", "status": "completed"},
]
def test_update_cycles_success(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(
update_cycle_service=MagicMock(
return_value=SimpleNamespace(
friendly_name="Updated Cycle",
status="completed",
id=1,
),
),
)
with client as c:
r = c.patch(f"{settings.settings.api_base}/cycles", json={"friendly_name": "Updated Cycle", "status": "completed", "id": 1})
assert r.status_code == 200
assert r.json() == {"id": 1, "friendly_name": "Updated Cycle", "status": "completed"}
def test_update_cycles_invalid_cycle_data(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(
update_cycle_service=MagicMock(side_effect=svc.InvalidCycleDataError("invalid cycle data")),
)
with client as c:
r = c.patch(f"{settings.settings.api_base}/cycles", json={"friendly_name": "", "status": "unknown", "id": 1})
assert r.status_code == status.HTTP_400_BAD_REQUEST
assert r.json() == {"detail": "invalid cycle data"}
def test_update_cycles_not_found(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(update_cycle_service=MagicMock(side_effect=svc.CycleNotFoundError("cycle not found")))
with client as c:
r = c.patch(f"{settings.settings.api_base}/cycles", json={"friendly_name": "NonExistent", "status": "active", "id": 999})
assert r.status_code == status.HTTP_404_NOT_FOUND
assert r.json() == {"detail": "cycle not found"}
def test_create_trade_success(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(
create_trade_service=MagicMock(
return_value=SimpleNamespace(),
),
)
with client as c:
r = c.post(
f"{settings.settings.api_base}/trades",
json={
"cycle_id": 1,
"exchange_id": 1,
"symbol": "BTCUSD",
"underlying_currency": "USD",
"trade_type": "LONG_SPOT",
"trade_strategy": "FX",
"quantity": 1,
"price_cents": 15,
"commission_cents": 100,
"trade_date": "2025-10-01",
},
)
assert r.status_code == 201
def test_create_trade_invalid_trade_data(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(
create_trade_service=MagicMock(side_effect=svc.InvalidTradeDataError("invalid trade data")),
)
with client as c:
r = c.post(
f"{settings.settings.api_base}/trades",
json={
"cycle_id": 1,
"exchange_id": 1,
"symbol": "BTCUSD",
"underlying_currency": "USD",
"trade_type": "LONG_SPOT",
"trade_strategy": "FX",
"quantity": 1,
"price_cents": 15,
"commission_cents": 100,
"trade_date": "2025-10-01",
},
)
assert r.status_code == status.HTTP_400_BAD_REQUEST
assert r.json() == {"detail": "invalid trade data"}
def test_get_trade_by_id_success(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(
get_trade_by_id_service=MagicMock(
return_value=SimpleNamespace(
id=1,
cycle_id=1,
exchange_id=1,
symbol="BTCUSD",
underlying_currency="USD",
trade_type="LONG_SPOT",
trade_strategy="FX",
quantity=1,
price_cents=1500,
commission_cents=100,
trade_date=datetime(2025, 10, 1, tzinfo=timezone.utc),
),
),
)
with client as c:
r = c.get(f"{settings.settings.api_base}/trades/1")
assert r.status_code == 200
assert r.json() == {
"id": 1,
"cycle_id": 1,
"exchange_id": 1,
"symbol": "BTCUSD",
"underlying_currency": "USD",
"trade_type": "LONG_SPOT",
"trade_strategy": "FX",
"quantity": 1,
"price_cents": 1500,
"commission_cents": 100,
"trade_date": "2025-10-01T00:00:00+00:00",
}
def test_get_trade_by_id_not_found(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(get_trade_by_id_service=MagicMock(side_effect=svc.TradeNotFoundError("trade not found")))
with client as c:
r = c.get(f"{settings.settings.api_base}/trades/999")
assert r.status_code == status.HTTP_404_NOT_FOUND
assert r.json() == {"detail": "trade not found"}
def test_update_trade_friendly_name_success(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(
update_trade_friendly_name_service=MagicMock(
return_value=SimpleNamespace(
id=1,
friendly_name="Updated Trade Name",
),
),
)
with client as c:
r = c.patch(f"{settings.settings.api_base}/trades/friendlyname", json={"id": 1, "friendly_name": "Updated Trade Name"})
assert r.status_code == 200
assert r.json() == {"id": 1, "friendly_name": "Updated Trade Name"}
def test_update_trade_friendly_name_not_found(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(update_trade_friendly_name_service=MagicMock(side_effect=svc.TradeNotFoundError("trade not found")))
with client as c:
r = c.patch(f"{settings.settings.api_base}/trades/friendlyname", json={"id": 999, "friendly_name": "NonExistent Trade"})
assert r.status_code == status.HTTP_404_NOT_FOUND
assert r.json() == {"detail": "trade not found"}
def test_update_trade_note_success(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(
update_trade_note_service=MagicMock(
return_value=SimpleNamespace(
id=1,
note="Updated trade note",
),
),
)
with client as c:
r = c.patch(f"{settings.settings.api_base}/trades/notes", json={"id": 1, "note": "Updated trade note"})
assert r.status_code == 200
assert r.json() == {"id": 1, "note": "Updated trade note"}
def test_update_trade_note_not_found(client_factory: Callable[..., TestClient]) -> None:
client = client_factory(update_trade_note_service=MagicMock(side_effect=svc.TradeNotFoundError("trade not found")))
with client as c:
r = c.patch(f"{settings.settings.api_base}/trades/notes", json={"id": 999, "note": "NonExistent Trade Note"})
assert r.status_code == status.HTTP_404_NOT_FOUND
assert r.json() == {"detail": "trade not found"}

backend/tests/test_crud.py Normal file

File diff suppressed because it is too large

backend/tests/test_db.py Normal file

@@ -0,0 +1,89 @@
from collections.abc import Generator
from contextlib import contextmanager, suppress
import pytest
from sqlalchemy import text
from sqlmodel import Session, SQLModel
from trading_journal.db import Database, create_database
@contextmanager
def session_ctx(db: Database) -> Generator[Session, None, None]:
"""
Drive Database.get_session() generator and correctly propagate exceptions
into the generator so the generator's except/rollback path runs.
"""
gen = db.get_session()
session = next(gen)
try:
yield session
except Exception as exc:
# Propagate the exception into the dependency generator so it can rollback.
with suppress(StopIteration):
gen.throw(exc)
raise
else:
# Normal completion: advance generator to let it commit/close.
with suppress(StopIteration):
next(gen)
finally:
# close the generator but DO NOT dispose the engine here
gen.close()
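The throw/advance protocol session_ctx drives can be demonstrated with a plain generator, independent of the Database class (a minimal sketch; `events` and `dependency` exist only for this illustration):

```python
# Exceptions raised in the consumer are thrown INTO the dependency generator
# so its except/finally blocks (rollback/close in the real code) actually run.
events = []

def dependency():
    events.append("setup")
    try:
        yield "resource"
    except RuntimeError:
        events.append("rollback")
        raise
    else:
        events.append("commit")
    finally:
        events.append("close")

gen = dependency()
resource = next(gen)
try:
    raise RuntimeError("handler failed")
except RuntimeError as exc:
    try:
        gen.throw(exc)   # runs the generator's except/finally path
    except RuntimeError:
        pass             # generator re-raised; swallowed for the demo

print(events)  # ['setup', 'rollback', 'close'] -- no 'commit'
```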
@contextmanager
def database_ctx(db: Database) -> Generator[Database, None, None]:
"""
Test-scoped context manager to ensure the Database (engine) is disposed at test end.
Use this to wrap test logic that needs the same in-memory engine across multiple sessions.
"""
try:
yield db
finally:
db.dispose()
def test_select_one_executes() -> None:
db = create_database(None) # in-memory by default
with database_ctx(db), session_ctx(db) as session:
val = session.exec(text("SELECT 1")).scalar_one()
assert int(val) == 1
def test_in_memory_persists_across_sessions_when_using_staticpool() -> None:
db = create_database(None) # in-memory with StaticPool
with database_ctx(db):
with session_ctx(db) as s1:
s1.exec(text("CREATE TABLE IF NOT EXISTS t (id INTEGER PRIMARY KEY, val TEXT);"))
s1.exec(text("INSERT INTO t (val) VALUES (:v)").bindparams(v="hello"))
with session_ctx(db) as s2:
got = s2.exec(text("SELECT val FROM t")).scalar_one()
assert got == "hello"
def test_sqlite_pragmas_applied() -> None:
db = create_database(None)
with database_ctx(db), session_ctx(db) as session:
# PRAGMA returns integer 1 when foreign_keys ON
fk = session.exec(text("PRAGMA foreign_keys")).scalar_one()
assert int(fk) == 1
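The PRAGMA asserted above is worth seeing on a raw connection: SQLite reports foreign_keys as 0/1 and leaves it OFF by default, which is why the Database wrapper is expected to enable it per-connection (stdlib-only sketch):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
default_fk = conn.execute("PRAGMA foreign_keys").fetchone()[0]  # off by default
conn.execute("PRAGMA foreign_keys = ON")
enabled_fk = conn.execute("PRAGMA foreign_keys").fetchone()[0]  # now reports 1
conn.close()
print(default_fk, enabled_fk)
```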
def test_rollback_on_exception() -> None:
db = create_database(None)
SQLModel.metadata.clear()
db.init_db()
with database_ctx(db):
# Create table then insert and raise inside the same session to force rollback
with pytest.raises(RuntimeError): # noqa: PT012, SIM117
with session_ctx(db) as s:
s.exec(text("CREATE TABLE IF NOT EXISTS t_rb (id INTEGER PRIMARY KEY, val TEXT);"))
s.exec(text("INSERT INTO t_rb (val) VALUES (:v)").bindparams(v="will_rollback"))
# simulate handler error -> should trigger rollback in get_session
raise RuntimeError("simulated failure")
# New session should not see the inserted row
with session_ctx(db) as s2:
rows = list(s2.exec(text("SELECT val FROM t_rb")).scalars())
assert rows == []


@@ -0,0 +1,210 @@
import pytest
from sqlalchemy import text
from sqlalchemy.pool import StaticPool
from sqlmodel import SQLModel, create_engine
from trading_journal import db_migration
def _base_type_of(compiled: str) -> str:
"""Return base type name (e.g. VARCHAR from VARCHAR(13)), upper-cased."""
return compiled.split("(")[0].strip().upper()
def test_run_migrations_0_to_1(monkeypatch: pytest.MonkeyPatch) -> None:
# in-memory engine that preserves the same connection (StaticPool)
SQLModel.metadata.clear()
engine = create_engine(
"sqlite:///:memory:",
connect_args={"check_same_thread": False},
poolclass=StaticPool,
)
try:
monkeypatch.setattr(db_migration, "LATEST_VERSION", 1)
final_version = db_migration.run_migrations(engine)
assert final_version == 1
expected_schema = {
"users": {
"id": ("INTEGER", 1, 1),
"username": ("TEXT", 1, 0),
"password_hash": ("TEXT", 1, 0),
"is_active": ("BOOLEAN", 1, 0),
},
"cycles": {
"id": ("INTEGER", 1, 1),
"user_id": ("INTEGER", 1, 0),
"friendly_name": ("TEXT", 0, 0),
"symbol": ("TEXT", 1, 0),
"exchange_id": ("INTEGER", 1, 0),
"underlying_currency": ("TEXT", 1, 0),
"status": ("TEXT", 1, 0),
"funding_source": ("TEXT", 0, 0),
"capital_exposure_cents": ("INTEGER", 0, 0),
"loan_amount_cents": ("INTEGER", 0, 0),
"loan_interest_rate_tenth_bps": ("INTEGER", 0, 0),
"start_date": ("DATE", 1, 0),
"end_date": ("DATE", 0, 0),
"latest_interest_accrued_date": ("DATE", 0, 0),
"total_accrued_amount_cents": ("INTEGER", 1, 0),
},
"cycle_loan_change_events": {
"id": ("INTEGER", 1, 1),
"cycle_id": ("INTEGER", 1, 0),
"effective_date": ("DATE", 1, 0),
"loan_amount_cents": ("INTEGER", 0, 0),
"loan_interest_rate_tenth_bps": ("INTEGER", 0, 0),
"related_trade_id": ("INTEGER", 0, 0),
"notes": ("TEXT", 0, 0),
"created_at": ("DATETIME", 1, 0),
},
"cycle_daily_accrual": {
"id": ("INTEGER", 1, 1),
"cycle_id": ("INTEGER", 1, 0),
"accrual_date": ("DATE", 1, 0),
"accrual_amount_cents": ("INTEGER", 1, 0),
"created_at": ("DATETIME", 1, 0),
},
"trades": {
"id": ("INTEGER", 1, 1),
"user_id": ("INTEGER", 1, 0),
"friendly_name": ("TEXT", 0, 0),
"symbol": ("TEXT", 1, 0),
"exchange_id": ("INTEGER", 1, 0),
"underlying_currency": ("TEXT", 1, 0),
"trade_type": ("TEXT", 1, 0),
"trade_strategy": ("TEXT", 1, 0),
"trade_date": ("DATE", 1, 0),
"trade_time_utc": ("DATETIME", 1, 0),
"expiry_date": ("DATE", 0, 0),
"strike_price_cents": ("INTEGER", 0, 0),
"quantity": ("INTEGER", 1, 0),
"quantity_multiplier": ("INTEGER", 1, 0),
"price_cents": ("INTEGER", 1, 0),
"gross_cash_flow_cents": ("INTEGER", 1, 0),
"commission_cents": ("INTEGER", 1, 0),
"net_cash_flow_cents": ("INTEGER", 1, 0),
"is_invalidated": ("BOOLEAN", 1, 0),
"invalidated_at": ("DATETIME", 0, 0),
"replaced_by_trade_id": ("INTEGER", 0, 0),
"notes": ("TEXT", 0, 0),
"cycle_id": ("INTEGER", 0, 0),
},
"exchanges": {
"id": ("INTEGER", 1, 1),
"user_id": ("INTEGER", 1, 0),
"name": ("TEXT", 1, 0),
"notes": ("TEXT", 0, 0),
},
"sessions": {
"id": ("INTEGER", 1, 1),
"user_id": ("INTEGER", 1, 0),
"session_token_hash": ("TEXT", 1, 0),
"created_at": ("DATETIME", 1, 0),
"expires_at": ("DATETIME", 1, 0),
"last_seen_at": ("DATETIME", 0, 0),
"last_used_ip": ("TEXT", 0, 0),
"user_agent": ("TEXT", 0, 0),
"device_name": ("TEXT", 0, 0),
},
}
expected_fks = {
"trades": [
{"table": "cycles", "from": "cycle_id", "to": "id"},
{"table": "users", "from": "user_id", "to": "id"},
{"table": "exchanges", "from": "exchange_id", "to": "id"},
],
"cycles": [
{"table": "users", "from": "user_id", "to": "id"},
{"table": "exchanges", "from": "exchange_id", "to": "id"},
],
"cycle_loan_change_events": [
{"table": "cycles", "from": "cycle_id", "to": "id"},
{"table": "trades", "from": "related_trade_id", "to": "id"},
],
"cycle_daily_accrual": [
{"table": "cycles", "from": "cycle_id", "to": "id"},
],
"sessions": [
{"table": "users", "from": "user_id", "to": "id"},
],
"users": [],
"exchanges": [
{"table": "users", "from": "user_id", "to": "id"},
],
}
with engine.connect() as conn:
# check tables exist
rows = conn.execute(
text("SELECT name FROM sqlite_master WHERE type='table'"),
).fetchall()
found_tables = {r[0] for r in rows}
assert set(expected_schema.keys()).issubset(found_tables), f"missing tables: {set(expected_schema.keys()) - found_tables}"
# check user_version
uv = conn.execute(text("PRAGMA user_version")).fetchone()
assert uv is not None
assert int(uv[0]) == 1
# validate each table columns
for tbl_name, cols in expected_schema.items():
info_rows = conn.execute(text(f"PRAGMA table_info({tbl_name})")).fetchall()
# map: name -> (type, notnull, pk)
actual = {r[1]: ((r[2] or "").upper(), int(r[3]), int(r[5])) for r in info_rows}
for colname, (exp_type, exp_notnull, exp_pk) in cols.items():
assert colname in actual, f"{tbl_name}: missing column {colname}"
act_type, act_notnull, act_pk = actual[colname]
# compare base type (e.g. VARCHAR(13) -> VARCHAR)
if act_type:
act_base = _base_type_of(act_type)
else:
act_base = ""
assert exp_type in act_base or act_base in exp_type, (
f"type mismatch {tbl_name}.{colname}: expected {exp_type}, got {act_base}"
)
assert act_notnull == exp_notnull, f"notnull mismatch {tbl_name}.{colname}: expected {exp_notnull}, got {act_notnull}"
assert act_pk == exp_pk, f"pk mismatch {tbl_name}.{colname}: expected {exp_pk}, got {act_pk}"
for tbl_name, fks in expected_fks.items():
fk_rows = conn.execute(text(f"PRAGMA foreign_key_list('{tbl_name}')")).fetchall()
# fk_rows columns: (id, seq, table, from, to, on_update, on_delete, match)
actual_fk_list = [{"table": r[2], "from": r[3], "to": r[4]} for r in fk_rows]
for efk in fks:
assert efk in actual_fk_list, f"missing FK on {tbl_name}: {efk}"
# check trades.replaced_by_trade_id self-referential FK
fk_rows = conn.execute(text("PRAGMA foreign_key_list('trades')")).fetchall()
actual_fk_list = [{"table": r[2], "from": r[3], "to": r[4]} for r in fk_rows]
assert {"table": "trades", "from": "replaced_by_trade_id", "to": "id"} in actual_fk_list, (
"missing self FK trades.replaced_by_trade_id -> trades.id"
)
# helper to find unique index on a column
def has_unique_index(table: str, column: str) -> bool:
idx_rows = conn.execute(text(f"PRAGMA index_list('{table}')")).fetchall()
for idx in idx_rows:
idx_name = idx[1]
is_unique = bool(idx[2])
if not is_unique:
continue
info = conn.execute(text(f"PRAGMA index_info('{idx_name}')")).fetchall()
cols = [r[2] for r in info]
if column in cols:
return True
return False
assert has_unique_index("trades", "friendly_name"), (
"expected unique index on trades(friendly_name) per uq_trades_user_friendly_name"
)
assert has_unique_index("cycles", "friendly_name"), (
"expected unique index on cycles(friendly_name) per uq_cycles_user_friendly_name"
)
assert has_unique_index("exchanges", "name"), "expected unique index on exchanges(name) per uq_exchanges_user_name"
assert has_unique_index("sessions", "session_token_hash"), "expected unique index on sessions(session_token_hash)"
assert has_unique_index("cycle_loan_change_events", "related_trade_id"), (
"expected unique index on cycle_loan_change_events(related_trade_id)"
)
finally:
engine.dispose()
SQLModel.metadata.clear()

View File

@@ -1,22 +0,0 @@
import pytest
from fastapi.testclient import TestClient
from app import app
@pytest.fixture
def client():
with TestClient(app) as client:
yield client
def test_home_route(client):
response = client.get("/")
assert response.status_code == 200
assert response.json() == {"message": "Hello"}
def test_about_route(client):
response = client.get("/about")
assert response.status_code == 200
assert response.json() == {"message": "This is the about page."}

View File

@@ -0,0 +1,24 @@
from trading_journal import security
def test_hash_and_verify_password() -> None:
plain = "password"
hashed = security.hash_password(plain)
assert hashed != plain
assert security.verify_password(plain, hashed)
def test_generate_session_token() -> None:
token1 = security.generate_session_token()
token2 = security.generate_session_token()
assert token1 != token2
assert len(token1) > 0
assert len(token2) > 0
def test_hash_and_verify_session_token_sha256() -> None:
token = security.generate_session_token()
token_hash = security.hash_session_token_sha256(token)
assert token_hash != token
assert security.verify_token_sha256(token, token_hash)
assert not security.verify_token_sha256(token + "x", token_hash)
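These tests pin down the `trading_journal.security` contract: tokens are unpredictable and unique per call, only a SHA-256 digest is stored, and verification compares digests. A self-contained stdlib sketch that would satisfy them (illustrative, not necessarily the project's implementation; `hmac.compare_digest` keeps the comparison constant-time):

```python
import hashlib
import hmac
import secrets

def generate_session_token() -> str:
    # 32 random bytes, URL-safe encoded: unique and unguessable per call
    return secrets.token_urlsafe(32)

def hash_session_token_sha256(token: str) -> str:
    return hashlib.sha256(token.encode("utf-8")).hexdigest()

def verify_token_sha256(token: str, token_hash: str) -> bool:
    # constant-time comparison of hex digests
    return hmac.compare_digest(hash_session_token_sha256(token), token_hash)
```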

File diff suppressed because it is too large

View File

@@ -12,7 +12,7 @@ def test_default_settings(monkeypatch: pytest.MonkeyPatch) -> None:
s = load_settings()
assert s.host == "0.0.0.0" # noqa: S104
- assert s.port == 8000 # noqa: PLR2004
+ assert s.port == 8000
assert s.workers == 1
assert s.log_level == "info"
@@ -26,8 +26,8 @@ def test_env_overrides(monkeypatch: pytest.MonkeyPatch) -> None:
s = load_settings()
assert s.host == "127.0.0.1"
- assert s.port == 9000 # noqa: PLR2004
- assert s.workers == 3 # noqa: PLR2004
+ assert s.port == 9000
+ assert s.workers == 3
assert s.log_level == "debug"
@@ -40,6 +40,6 @@ def test_yaml_config_file(tmp_path: Path, monkeypatch: pytest.MonkeyPatch) -> No
s = load_settings()
assert s.host == "10.0.0.5"
- assert s.port == 8088 # noqa: PLR2004
- assert s.workers == 5 # noqa: PLR2004
+ assert s.port == 8088
+ assert s.workers == 5
assert s.log_level == "debug"

View File

@@ -0,0 +1,623 @@
from __future__ import annotations
from datetime import date, datetime, timedelta, timezone
from typing import TYPE_CHECKING, Any, TypeVar, cast
from pydantic import BaseModel
from sqlalchemy.exc import IntegrityError
from sqlmodel import Session, select
from trading_journal import models
if TYPE_CHECKING:
from collections.abc import Mapping
from enum import Enum
from sqlalchemy.sql.elements import ColumnElement
# Generic enum member type
T = TypeVar("T", bound="Enum")
def _check_enum(enum_cls: type[T], value: object, field_name: str) -> T:
if value is None:
raise ValueError(f"{field_name} is required")
# already an enum member
if isinstance(value, enum_cls):
return value
# strict string match: must match exactly enum name or enum value (case-sensitive)
if isinstance(value, str):
for m in enum_cls:
if m.name == value or str(m.value) == value:
return m
allowed = [m.name for m in enum_cls]
raise ValueError(f"Invalid {field_name!s}: {value!r}. Allowed: {allowed}")
def _allowed_columns(model: type[models.SQLModel]) -> set[str]:
tbl = cast("models.SQLModel", model).__table__ # type: ignore[attr-defined]
return {c.name for c in tbl.columns}
AnyModel = Any
def _data_to_dict(data: AnyModel) -> dict[str, AnyModel]:
if isinstance(data, BaseModel):
return data.model_dump(exclude_unset=True)
if hasattr(data, "dict"):
return data.dict(exclude_unset=True)
return dict(data)
# Trades
def create_trade(session: Session, trade_data: Mapping[str, Any] | BaseModel) -> models.Trades:
data = _data_to_dict(trade_data)
allowed = _allowed_columns(models.Trades)
payload = {k: v for k, v in data.items() if k in allowed}
cycle_id = payload.get("cycle_id")
if "symbol" not in payload:
raise ValueError("symbol is required")
if "exchange_id" not in payload and cycle_id is None:
raise ValueError("exchange_id is required when no cycle is attached")
# If an exchange_id is provided (and no cycle is attached), ensure the exchange exists
# and belongs to the same user as the trade (if user_id is provided).
if cycle_id is None and "exchange_id" in payload:
ex = session.get(models.Exchanges, payload["exchange_id"])
if ex is None:
raise ValueError("exchange_id does not exist")
user_id = payload.get("user_id")
if user_id is not None and ex.user_id != user_id:
raise ValueError("exchange.user_id does not match trade.user_id")
if "underlying_currency" not in payload:
raise ValueError("underlying_currency is required")
payload["underlying_currency"] = _check_enum(models.UnderlyingCurrency, payload["underlying_currency"], "underlying_currency")
if "trade_type" not in payload:
raise ValueError("trade_type is required")
payload["trade_type"] = _check_enum(models.TradeType, payload["trade_type"], "trade_type")
if "trade_strategy" not in payload:
raise ValueError("trade_strategy is required")
payload["trade_strategy"] = _check_enum(models.TradeStrategy, payload["trade_strategy"], "trade_strategy")
# trade_time_utc is the creation moment: always set to now (caller shouldn't provide)
now = datetime.now(timezone.utc)
payload.pop("trade_time_utc", None)
payload["trade_time_utc"] = now
if "trade_date" not in payload or payload.get("trade_date") is None:
payload["trade_date"] = payload["trade_time_utc"].date()
user_id = payload.get("user_id")
if "quantity" not in payload:
raise ValueError("quantity is required")
if "price_cents" not in payload:
raise ValueError("price_cents is required")
if "commission_cents" not in payload:
payload["commission_cents"] = 0
if "gross_cash_flow_cents" not in payload:
raise ValueError("gross_cash_flow_cents is required")
if "net_cash_flow_cents" not in payload:
raise ValueError("net_cash_flow_cents is required")
# If no cycle_id provided, create Cycle instance but don't call create_cycle()
created_cycle = None
if cycle_id is None:
c_payload = {
"user_id": user_id,
"symbol": payload["symbol"],
"exchange_id": payload["exchange_id"],
"underlying_currency": payload["underlying_currency"],
"friendly_name": "Auto-created Cycle by trade " + payload.get("friendly_name", ""),
"status": models.CycleStatus.OPEN,
"start_date": payload["trade_date"],
}
created_cycle = models.Cycles(**c_payload)
session.add(created_cycle)
# do NOT flush here; will flush together with trade below
# If cycle_id provided, validate existence and ownership
if cycle_id is not None:
cycle = session.get(models.Cycles, cycle_id)
if cycle is None:
raise ValueError("cycle_id does not exist")
payload.pop("exchange_id", None) # ignore exchange_id if provided; use cycle's exchange_id
payload["exchange_id"] = cycle.exchange_id
if cycle.user_id != user_id:
raise ValueError("cycle.user_id does not match trade.user_id")
# Build trade instance; if we created a Cycle instance, link via relationship so a single flush will persist both and populate ids
t_payload = dict(payload)
# remove cycle_id if we're using created_cycle; relationship will set it on flush
if created_cycle is not None:
t_payload.pop("cycle_id", None)
t = models.Trades(**t_payload)
if created_cycle is not None:
t.cycle = created_cycle
session.add(t)
try:
session.flush()
except IntegrityError as e:
session.rollback()
raise ValueError("create_trade integrity error") from e
session.refresh(t)
return t
def get_trade_by_id(session: Session, trade_id: int) -> models.Trades | None:
return session.get(models.Trades, trade_id)
def get_trade_by_user_id_and_friendly_name(session: Session, user_id: int, friendly_name: str) -> models.Trades | None:
statement = select(models.Trades).where(
models.Trades.user_id == user_id,
models.Trades.friendly_name == friendly_name,
)
return session.exec(statement).first()
def get_trades_by_user_id(session: Session, user_id: int) -> list[models.Trades]:
statement = select(models.Trades).where(
models.Trades.user_id == user_id,
)
return list(session.exec(statement).all())
def update_trade_friendly_name(session: Session, trade_id: int, friendly_name: str) -> models.Trades:
trade: models.Trades | None = session.get(models.Trades, trade_id)
if trade is None:
raise ValueError("trade_id does not exist")
trade.friendly_name = friendly_name
session.add(trade)
try:
session.flush()
except IntegrityError as e:
session.rollback()
raise ValueError("update_trade_friendly_name integrity error") from e
session.refresh(trade)
return trade
def update_trade_note(session: Session, trade_id: int, note: str) -> models.Trades:
trade: models.Trades | None = session.get(models.Trades, trade_id)
if trade is None:
raise ValueError("trade_id does not exist")
trade.notes = note
session.add(trade)
try:
session.flush()
except IntegrityError as e:
session.rollback()
raise ValueError("update_trade_note integrity error") from e
session.refresh(trade)
return trade
def invalidate_trade(session: Session, trade_id: int) -> models.Trades:
trade: models.Trades | None = session.get(models.Trades, trade_id)
if trade is None:
raise ValueError("trade_id does not exist")
if trade.is_invalidated:
raise ValueError("trade is already invalidated")
trade.is_invalidated = True
trade.invalidated_at = datetime.now(timezone.utc)
session.add(trade)
try:
session.flush()
except IntegrityError as e:
session.rollback()
raise ValueError("invalidate_trade integrity error") from e
session.refresh(trade)
return trade
def replace_trade(session: Session, old_trade_id: int, new_trade_data: Mapping[str, Any] | BaseModel) -> models.Trades:
old_trade = invalidate_trade(session, old_trade_id)
new_trade = create_trade(session, _data_to_dict(new_trade_data))
# the invalidated trade records which trade superseded it
old_trade.replaced_by_trade_id = new_trade.id
session.add(old_trade)
session.flush()
return new_trade
# Cycles
def create_cycle(session: Session, cycle_data: Mapping[str, Any] | BaseModel) -> models.Cycles:
data = _data_to_dict(cycle_data)
allowed = _allowed_columns(models.Cycles)
payload = {k: v for k, v in data.items() if k in allowed}
if "user_id" not in payload:
raise ValueError("user_id is required")
if "symbol" not in payload:
raise ValueError("symbol is required")
if "exchange_id" not in payload:
raise ValueError("exchange_id is required")
# ensure the exchange exists and belongs to the same user
ex = session.get(models.Exchanges, payload["exchange_id"])
if ex is None:
raise ValueError("exchange_id does not exist")
if ex.user_id != payload.get("user_id"):
raise ValueError("exchange.user_id does not match cycle.user_id")
if "underlying_currency" not in payload:
raise ValueError("underlying_currency is required")
payload["underlying_currency"] = _check_enum(models.UnderlyingCurrency, payload["underlying_currency"], "underlying_currency")
if "status" not in payload:
raise ValueError("status is required")
payload["status"] = _check_enum(models.CycleStatus, payload["status"], "status")
if "start_date" not in payload:
raise ValueError("start_date is required")
c = models.Cycles(**payload)
session.add(c)
try:
session.flush()
except IntegrityError as e:
session.rollback()
raise ValueError("create_cycle integrity error") from e
session.refresh(c)
return c
IMMUTABLE_CYCLE_FIELDS = {"id", "user_id", "start_date"}
def get_cycle_by_id(session: Session, cycle_id: int) -> models.Cycles | None:
return session.get(models.Cycles, cycle_id)
def get_cycles_by_user_id(session: Session, user_id: int) -> list[models.Cycles]:
statement = select(models.Cycles).where(
models.Cycles.user_id == user_id,
)
return list(session.exec(statement).all())
def update_cycle(session: Session, cycle_id: int, update_data: Mapping[str, Any] | BaseModel) -> models.Cycles:
cycle: models.Cycles | None = session.get(models.Cycles, cycle_id)
if cycle is None:
raise ValueError("cycle_id does not exist")
data = _data_to_dict(update_data)
allowed = _allowed_columns(models.Cycles)
for k, v in data.items():
if k in IMMUTABLE_CYCLE_FIELDS:
raise ValueError(f"field {k!r} is immutable")
if k not in allowed:
continue
# If trying to change exchange_id, ensure the new exchange exists and belongs to
# the same user as the cycle.
if k == "exchange_id":
ex = session.get(models.Exchanges, v)
if ex is None:
raise ValueError("exchange_id does not exist")
if ex.user_id != cycle.user_id:
raise ValueError("exchange.user_id does not match cycle.user_id")
if k == "underlying_currency":
v = _check_enum(models.UnderlyingCurrency, v, "underlying_currency") # noqa: PLW2901
if k == "status":
v = _check_enum(models.CycleStatus, v, "status") # noqa: PLW2901
setattr(cycle, k, v)
session.add(cycle)
try:
session.flush()
except IntegrityError as e:
session.rollback()
raise ValueError("update_cycle integrity error") from e
session.refresh(cycle)
return cycle
# Cycle loan and interest
def create_cycle_loan_event(session: Session, loan_data: Mapping[str, Any] | BaseModel) -> models.CycleLoanChangeEvents:
data = _data_to_dict(loan_data)
allowed = _allowed_columns(models.CycleLoanChangeEvents)
payload = {k: v for k, v in data.items() if k in allowed}
if "cycle_id" not in payload:
raise ValueError("cycle_id is required")
cycle = session.get(models.Cycles, payload["cycle_id"])
if cycle is None:
raise ValueError("cycle_id does not exist")
payload["effective_date"] = payload.get("effective_date") or datetime.now(timezone.utc).date()
payload["created_at"] = datetime.now(timezone.utc)
cle = models.CycleLoanChangeEvents(**payload)
session.add(cle)
try:
session.flush()
except IntegrityError as e:
session.rollback()
raise ValueError("create_cycle_loan_event integrity error") from e
session.refresh(cle)
return cle
def get_loan_events_by_cycle_id(session: Session, cycle_id: int) -> list[models.CycleLoanChangeEvents]:
eff_col = cast("ColumnElement", models.CycleLoanChangeEvents.effective_date)
id_col = cast("ColumnElement", models.CycleLoanChangeEvents.id)
statement = (
select(models.CycleLoanChangeEvents)
.where(
models.CycleLoanChangeEvents.cycle_id == cycle_id,
)
.order_by(eff_col, id_col.asc())
)
return list(session.exec(statement).all())
def create_cycle_daily_accrual(session: Session, cycle_id: int, accrual_date: date, accrual_amount_cents: int) -> models.CycleDailyAccrual:
cycle = session.get(models.Cycles, cycle_id)
if cycle is None:
raise ValueError("cycle_id does not exist")
existing = session.exec(
select(models.CycleDailyAccrual).where(
models.CycleDailyAccrual.cycle_id == cycle_id,
models.CycleDailyAccrual.accrual_date == accrual_date,
),
).first()
if existing:
return existing
if accrual_amount_cents < 0:
raise ValueError("accrual_amount_cents must be non-negative")
row = models.CycleDailyAccrual(
cycle_id=cycle_id,
accrual_date=accrual_date,
accrual_amount_cents=accrual_amount_cents,
created_at=datetime.now(timezone.utc),
)
session.add(row)
try:
session.flush()
except IntegrityError as e:
session.rollback()
raise ValueError("create_cycle_daily_accrual integrity error") from e
session.refresh(row)
return row
def get_cycle_daily_accruals_by_cycle_id(session: Session, cycle_id: int) -> list[models.CycleDailyAccrual]:
date_col = cast("ColumnElement", models.CycleDailyAccrual.accrual_date)
statement = (
select(models.CycleDailyAccrual)
.where(
models.CycleDailyAccrual.cycle_id == cycle_id,
)
.order_by(date_col.asc())
)
return list(session.exec(statement).all())
def get_cycle_daily_accrual_by_cycle_id_and_date(session: Session, cycle_id: int, accrual_date: date) -> models.CycleDailyAccrual | None:
statement = select(models.CycleDailyAccrual).where(
models.CycleDailyAccrual.cycle_id == cycle_id,
models.CycleDailyAccrual.accrual_date == accrual_date,
)
return session.exec(statement).first()
# Exchanges
IMMUTABLE_EXCHANGE_FIELDS = {"id"}
def create_exchange(session: Session, exchange_data: Mapping[str, Any] | BaseModel) -> models.Exchanges:
data = _data_to_dict(exchange_data)
allowed = _allowed_columns(models.Exchanges)
payload = {k: v for k, v in data.items() if k in allowed}
if "name" not in payload:
raise ValueError("name is required")
exchange = models.Exchanges(**payload)
session.add(exchange)
try:
session.flush()
except IntegrityError as exc:
session.rollback()
raise ValueError("create_exchange integrity error") from exc
session.refresh(exchange)
return exchange
def get_exchange_by_id(session: Session, exchange_id: int) -> models.Exchanges | None:
return session.get(models.Exchanges, exchange_id)
def get_exchange_by_name_and_user_id(session: Session, name: str, user_id: int) -> models.Exchanges | None:
statement = select(models.Exchanges).where(
models.Exchanges.name == name,
models.Exchanges.user_id == user_id,
)
return session.exec(statement).first()
def get_all_exchanges(session: Session) -> list[models.Exchanges]:
statement = select(models.Exchanges)
return list(session.exec(statement).all())
def get_all_exchanges_by_user_id(session: Session, user_id: int) -> list[models.Exchanges]:
statement = select(models.Exchanges).where(
models.Exchanges.user_id == user_id,
)
return list(session.exec(statement).all())
def update_exchange(session: Session, exchange_id: int, update_data: Mapping[str, Any] | BaseModel) -> models.Exchanges:
exchange: models.Exchanges | None = session.get(models.Exchanges, exchange_id)
if exchange is None:
raise ValueError("exchange_id does not exist")
data = _data_to_dict(update_data)
allowed = _allowed_columns(models.Exchanges)
for k, v in data.items():
if k in IMMUTABLE_EXCHANGE_FIELDS:
raise ValueError(f"field {k!r} is immutable")
if k in allowed:
setattr(exchange, k, v)
session.add(exchange)
try:
session.flush()
except IntegrityError as e:
session.rollback()
raise ValueError("update_exchange integrity error") from e
session.refresh(exchange)
return exchange
def delete_exchange(session: Session, exchange_id: int) -> None:
exchange: models.Exchanges | None = session.get(models.Exchanges, exchange_id)
if exchange is None:
return
session.delete(exchange)
try:
session.flush()
except IntegrityError as e:
session.rollback()
raise ValueError("delete_exchange integrity error") from e
# Users
IMMUTABLE_USER_FIELDS = {"id", "username", "created_at"}
def create_user(session: Session, user_data: Mapping[str, Any] | BaseModel) -> models.Users:
data = _data_to_dict(user_data)
allowed = _allowed_columns(models.Users)
payload = {k: v for k, v in data.items() if k in allowed}
if "username" not in payload:
raise ValueError("username is required")
if "password_hash" not in payload:
raise ValueError("password_hash is required")
u = models.Users(**payload)
session.add(u)
try:
session.flush()
except IntegrityError as e:
session.rollback()
raise ValueError("create_user integrity error") from e
session.refresh(u)
return u
def get_user_by_id(session: Session, user_id: int) -> models.Users | None:
return session.get(models.Users, user_id)
def get_user_by_username(session: Session, username: str) -> models.Users | None:
statement = select(models.Users).where(
models.Users.username == username,
)
return session.exec(statement).first()
def update_user(session: Session, user_id: int, update_data: Mapping[str, Any] | BaseModel) -> models.Users:
user: models.Users | None = session.get(models.Users, user_id)
if user is None:
raise ValueError("user_id does not exist")
data = _data_to_dict(update_data)
allowed = _allowed_columns(models.Users)
for k, v in data.items():
if k in IMMUTABLE_USER_FIELDS:
raise ValueError(f"field {k!r} is immutable")
if k in allowed:
setattr(user, k, v)
session.add(user)
try:
session.flush()
except IntegrityError as e:
session.rollback()
raise ValueError("update_user integrity error") from e
session.refresh(user)
return user
# Sessions
def create_login_session(
session: Session,
user_id: int,
session_token_hash: str,
session_length_seconds: int = 86400,
last_used_ip: str | None = None,
user_agent: str | None = None,
device_name: str | None = None,
) -> models.Sessions:
user: models.Users | None = session.get(models.Users, user_id)
if user is None:
raise ValueError("user_id does not exist")
user_id_val = cast("int", user.id)
now = datetime.now(timezone.utc)
expires_at = now + timedelta(seconds=session_length_seconds)
s = models.Sessions(
user_id=user_id_val,
session_token_hash=session_token_hash,
created_at=now,
expires_at=expires_at,
last_seen_at=now,
last_used_ip=last_used_ip,
user_agent=user_agent,
device_name=device_name,
)
session.add(s)
try:
session.flush()
except IntegrityError as e:
session.rollback()
raise ValueError("create_login_session integrity error") from e
session.refresh(s)
return s
def get_login_session_by_token_hash_and_user_id(session: Session, session_token_hash: str, user_id: int) -> models.Sessions | None:
statement = select(models.Sessions).where(
models.Sessions.session_token_hash == session_token_hash,
models.Sessions.user_id == user_id,
models.Sessions.expires_at > datetime.now(timezone.utc),
)
return session.exec(statement).first()
def get_login_session_by_token_hash(session: Session, session_token_hash: str) -> models.Sessions | None:
statement = select(models.Sessions).where(
models.Sessions.session_token_hash == session_token_hash,
models.Sessions.expires_at > datetime.now(timezone.utc),
)
return session.exec(statement).first()
IMMUTABLE_SESSION_FIELDS = {"id", "user_id", "session_token_hash", "created_at"}
def update_login_session(session: Session, session_token_hashed: str, update_session: Mapping[str, Any] | BaseModel) -> models.Sessions | None:
login_session: models.Sessions | None = session.exec(
select(models.Sessions).where(
models.Sessions.session_token_hash == session_token_hashed,
models.Sessions.expires_at > datetime.now(timezone.utc),
),
).first()
if login_session is None:
return None
data = _data_to_dict(update_session)
allowed = _allowed_columns(models.Sessions)
for k, v in data.items():
if k in allowed and k not in IMMUTABLE_SESSION_FIELDS:
setattr(login_session, k, v)
session.add(login_session)
try:
session.flush()
except IntegrityError as e:
session.rollback()
raise ValueError("update_login_session integrity error") from e
session.refresh(login_session)
return login_session
def delete_login_session(session: Session, session_token_hash: str) -> None:
login_session: models.Sessions | None = session.exec(
select(models.Sessions).where(
models.Sessions.session_token_hash == session_token_hash,
),
).first()
if login_session is None:
return
session.delete(login_session)
try:
session.flush()
except IntegrityError as e:
session.rollback()
raise ValueError("delete_login_session integrity error") from e

View File

@@ -0,0 +1,95 @@
from __future__ import annotations
import logging
from contextlib import contextmanager
from typing import TYPE_CHECKING
from sqlalchemy import event
from sqlalchemy.pool import StaticPool
from sqlmodel import Session, create_engine
if TYPE_CHECKING:
from collections.abc import Generator
from sqlite3 import Connection as DBAPIConnection
class Database:
def __init__(
self,
database_url: str | None = None,
*,
echo: bool = False,
connect_args: dict | None = None,
) -> None:
self._database_url = database_url or "sqlite:///:memory:"
default_connect = {"check_same_thread": False, "timeout": 30} if self._database_url.startswith("sqlite") else {}
merged_connect = {**default_connect, **(connect_args or {})}
if self._database_url == "sqlite:///:memory:":
logger = logging.getLogger(__name__)
logger.warning(
"Using in-memory SQLite database; all data will be lost when the application stops.",
)
self._engine = create_engine(
self._database_url,
echo=echo,
connect_args=merged_connect,
poolclass=StaticPool,
)
else:
self._engine = create_engine(self._database_url, echo=echo, connect_args=merged_connect)
if self._database_url.startswith("sqlite"):
def _enable_sqlite_pragmas(dbapi_conn: DBAPIConnection, _connection_record: object) -> None:
try:
cur = dbapi_conn.cursor()
cur.execute("PRAGMA journal_mode=WAL;")
cur.execute("PRAGMA synchronous=NORMAL;")
cur.execute("PRAGMA foreign_keys=ON;")
cur.execute("PRAGMA busy_timeout=30000;")
cur.close()
except Exception:
logger = logging.getLogger(__name__)
logger.exception("Failed to set sqlite pragmas on new connection: ")
event.listen(self._engine, "connect", _enable_sqlite_pragmas)
def init_db(self) -> None:
pass
def get_session(self) -> Generator[Session, None, None]:
session = Session(self._engine)
try:
yield session
session.commit()
except Exception:
session.rollback()
raise
finally:
session.close()
@contextmanager
def get_session_ctx_manager(self) -> Generator[Session, None, None]:
session = Session(self._engine)
try:
yield session
session.commit()
except Exception:
session.rollback()
raise
finally:
session.close()
def dispose(self) -> None:
self._engine.dispose()
def create_database(
database_url: str | None = None,
*,
echo: bool = False,
connect_args: dict | None = None,
) -> Database:
return Database(database_url, echo=echo, connect_args=connect_args)
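The `connect`-event hook above applies its PRAGMAs to every pooled connection, not just the first. The same per-connection setup can be sketched with the stdlib `sqlite3` module (WAL is omitted here because it has no effect on in-memory databases):

```python
import sqlite3

def connect_with_pragmas(path: str = ":memory:") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    # mirror the event hook: enforce FKs and tolerate brief lock contention
    conn.execute("PRAGMA foreign_keys=ON;")
    conn.execute("PRAGMA busy_timeout=30000;")
    return conn
```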

View File

@@ -0,0 +1,73 @@
from __future__ import annotations
from typing import TYPE_CHECKING, Callable
from sqlalchemy import text
from sqlmodel import SQLModel
if TYPE_CHECKING:
from sqlalchemy.engine import Connection, Engine
LATEST_VERSION = 1
def _mig_0_1(engine: Engine) -> None:
"""
Initial schema: create all tables from SQLModel models.
Safe to call on an empty DB; idempotent for missing tables.
"""
# Ensure all models are imported before this is called (import side-effect registers tables)
# e.g. trading_journal.models is imported in the caller / app startup.
from trading_journal import models_v1
SQLModel.metadata.create_all(
bind=engine,
tables=[
models_v1.Trades.__table__, # type: ignore[attr-defined]
models_v1.Cycles.__table__, # type: ignore[attr-defined]
models_v1.Users.__table__, # type: ignore[attr-defined]
models_v1.Sessions.__table__, # type: ignore[attr-defined]
models_v1.Exchanges.__table__, # type: ignore[attr-defined]
models_v1.CycleLoanChangeEvents.__table__, # type: ignore[attr-defined]
models_v1.CycleDailyAccrual.__table__, # type: ignore[attr-defined]
],
)
# map current_version -> function that migrates from current_version -> current_version+1
MIGRATIONS: dict[int, Callable[[Engine], None]] = {
0: _mig_0_1,
}
def _get_sqlite_user_version(conn: Connection) -> int:
row = conn.execute(text("PRAGMA user_version")).fetchone()
return int(row[0]) if row and row[0] is not None else 0
def _set_sqlite_user_version(conn: Connection, v: int) -> None:
conn.execute(text(f"PRAGMA user_version = {int(v)}"))
def run_migrations(engine: Engine, target_version: int | None = None) -> int:
"""
Run migrations up to target_version (or LATEST_VERSION).
Returns final applied version.
"""
target = LATEST_VERSION if target_version is None else target_version
with engine.begin() as conn:
driver = conn.engine.name.lower()
if driver == "sqlite":
cur_version = _get_sqlite_user_version(conn)
while cur_version < target:
fn = MIGRATIONS.get(cur_version)
if fn is None:
raise RuntimeError(
f"No migration from {cur_version} -> {cur_version + 1}",
)
# call migration with Engine (fn should use transactions)
fn(engine)
_set_sqlite_user_version(conn, cur_version + 1)
cur_version += 1
return cur_version
return -1 # unknown / unsupported driver; no-op
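The `user_version` scheme above is easy to exercise end to end; a self-contained sketch of the same stepwise pattern using the stdlib driver (toy migration, not the project's schema):

```python
import sqlite3

LATEST_VERSION = 1

def _mig_0_1(conn: sqlite3.Connection) -> None:
    conn.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY)")

MIGRATIONS = {0: _mig_0_1}

def run_migrations(conn: sqlite3.Connection, target: int = LATEST_VERSION) -> int:
    cur_version = conn.execute("PRAGMA user_version").fetchone()[0]
    while cur_version < target:
        fn = MIGRATIONS.get(cur_version)
        if fn is None:
            raise RuntimeError(f"No migration from {cur_version} -> {cur_version + 1}")
        fn(conn)
        cur_version += 1
        # record progress so a crash mid-chain resumes from the right step
        conn.execute(f"PRAGMA user_version = {cur_version}")
    return cur_version
```

Running it a second time is a no-op, consistent with the `user_version == 1` check in the migration test earlier in this diff.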

View File

@@ -0,0 +1,136 @@
from __future__ import annotations
from datetime import date, datetime # noqa: TC003
from pydantic import BaseModel
from sqlmodel import SQLModel
from trading_journal.models import TradeStrategy, TradeType, UnderlyingCurrency # noqa: TC001
class UserBase(SQLModel):
username: str
is_active: bool = True
class UserCreate(UserBase):
password: str
class UserLogin(BaseModel):
username: str
password: str
class UserRead(UserBase):
id: int
class SessionsBase(SQLModel):
user_id: int
class SessionRead(SessionsBase):
id: int
expires_at: datetime
last_seen_at: datetime | None
last_used_ip: str | None
user_agent: str | None
class SessionsCreate(SessionsBase):
expires_at: datetime
class SessionsUpdate(SQLModel):
expires_at: datetime | None = None
last_seen_at: datetime | None = None
last_used_ip: str | None = None
user_agent: str | None = None
class ExchangesBase(SQLModel):
name: str
notes: str | None = None
class ExchangesCreate(ExchangesBase):
user_id: int
class ExchangesRead(ExchangesBase):
id: int
class CycleBase(SQLModel):
friendly_name: str | None = None
status: str
end_date: date | None = None
funding_source: str | None = None
capital_exposure_cents: int | None = None
loan_amount_cents: int | None = None
loan_interest_rate_tenth_bps: int | None = None
trades: list[TradeRead] | None = None
exchange: ExchangesRead | None = None
class CycleCreate(CycleBase):
user_id: int
symbol: str
exchange_id: int
underlying_currency: UnderlyingCurrency
start_date: date
class CycleUpdate(CycleBase):
id: int
class CycleRead(CycleCreate):
id: int
class TradeBase(SQLModel):
friendly_name: str | None = None
symbol: str
exchange_id: int
underlying_currency: UnderlyingCurrency
trade_type: TradeType
trade_strategy: TradeStrategy
trade_date: date
quantity: int
price_cents: int
commission_cents: int
notes: str | None = None
cycle_id: int | None = None
class TradeCreate(TradeBase):
user_id: int | None = None
trade_time_utc: datetime | None = None
gross_cash_flow_cents: int | None = None
net_cash_flow_cents: int | None = None
quantity_multiplier: int = 1
expiry_date: date | None = None
strike_price_cents: int | None = None
is_invalidated: bool = False
invalidated_at: datetime | None = None
replaced_by_trade_id: int | None = None
class TradeNoteUpdate(BaseModel):
id: int
notes: str | None = None
class TradeFriendlyNameUpdate(BaseModel):
id: int
friendly_name: str
class TradeRead(TradeCreate):
id: int
SessionsCreate.model_rebuild()
CycleBase.model_rebuild()

View File

@@ -0,0 +1,199 @@
from datetime import date, datetime
from enum import Enum
from typing import Optional
from sqlmodel import (
Column,
Date,
DateTime,
Field,
ForeignKey,
Integer,
Relationship,
SQLModel,
Text,
UniqueConstraint,
)
class TradeType(str, Enum):
SELL_PUT = "SELL_PUT"
CLOSE_SELL_PUT = "CLOSE_SELL_PUT"
ASSIGNMENT = "ASSIGNMENT"
SELL_CALL = "SELL_CALL"
CLOSE_SELL_CALL = "CLOSE_SELL_CALL"
EXERCISE_CALL = "EXERCISE_CALL"
LONG_SPOT = "LONG_SPOT"
CLOSE_LONG_SPOT = "CLOSE_LONG_SPOT"
SHORT_SPOT = "SHORT_SPOT"
CLOSE_SHORT_SPOT = "CLOSE_SHORT_SPOT"
LONG_CFD = "LONG_CFD"
CLOSE_LONG_CFD = "CLOSE_LONG_CFD"
SHORT_CFD = "SHORT_CFD"
CLOSE_SHORT_CFD = "CLOSE_SHORT_CFD"
LONG_OTHER = "LONG_OTHER"
CLOSE_LONG_OTHER = "CLOSE_LONG_OTHER"
SHORT_OTHER = "SHORT_OTHER"
CLOSE_SHORT_OTHER = "CLOSE_SHORT_OTHER"
class TradeStrategy(str, Enum):
WHEEL = "WHEEL"
FX = "FX"
SPOT = "SPOT"
OTHER = "OTHER"
class CycleStatus(str, Enum):
OPEN = "OPEN"
CLOSED = "CLOSED"
class UnderlyingCurrency(str, Enum):
EUR = "EUR"
USD = "USD"
GBP = "GBP"
JPY = "JPY"
AUD = "AUD"
CAD = "CAD"
CHF = "CHF"
NZD = "NZD"
CNY = "CNY"
class FundingSource(str, Enum):
CASH = "CASH"
MARGIN = "MARGIN"
MIXED = "MIXED"
class Trades(SQLModel, table=True):
__tablename__ = "trades" # type: ignore[attr-defined]
__table_args__ = (UniqueConstraint("user_id", "friendly_name", name="uq_trades_user_friendly_name"),)
id: int | None = Field(default=None, primary_key=True)
user_id: int = Field(foreign_key="users.id", nullable=False, index=True)
# allow null while user may omit friendly_name; uniqueness enforced per-user by constraint
friendly_name: str | None = Field(default=None, sa_column=Column(Text, nullable=True))
symbol: str = Field(sa_column=Column(Text, nullable=False))
exchange_id: int = Field(foreign_key="exchanges.id", nullable=False, index=True)
exchange: "Exchanges" = Relationship(back_populates="trades")
underlying_currency: UnderlyingCurrency = Field(sa_column=Column(Text, nullable=False))
trade_type: TradeType = Field(sa_column=Column(Text, nullable=False))
trade_strategy: TradeStrategy = Field(sa_column=Column(Text, nullable=False))
trade_date: date = Field(sa_column=Column(Date, nullable=False))
trade_time_utc: datetime = Field(sa_column=Column(DateTime(timezone=True), nullable=False))
expiry_date: date | None = Field(default=None, nullable=True)
strike_price_cents: int | None = Field(default=None, nullable=True)
quantity: int = Field(sa_column=Column(Integer, nullable=False))
quantity_multiplier: int = Field(sa_column=Column(Integer, nullable=False), default=1)
price_cents: int = Field(sa_column=Column(Integer, nullable=False))
gross_cash_flow_cents: int = Field(sa_column=Column(Integer, nullable=False))
commission_cents: int = Field(sa_column=Column(Integer, nullable=False))
net_cash_flow_cents: int = Field(sa_column=Column(Integer, nullable=False))
is_invalidated: bool = Field(default=False, nullable=False)
invalidated_at: datetime | None = Field(default=None, sa_column=Column(DateTime(timezone=True), nullable=True))
replaced_by_trade_id: int | None = Field(default=None, foreign_key="trades.id", nullable=True)
notes: str | None = Field(default=None, sa_column=Column(Text, nullable=True))
cycle_id: int | None = Field(default=None, foreign_key="cycles.id", nullable=True, index=True)
cycle: "Cycles" = Relationship(back_populates="trades")
related_loan_change_event: Optional["CycleLoanChangeEvents"] = Relationship(
back_populates="trade",
sa_relationship_kwargs={"uselist": False},
)
class Cycles(SQLModel, table=True):
__tablename__ = "cycles" # type: ignore[attr-defined]
__table_args__ = (UniqueConstraint("user_id", "friendly_name", name="uq_cycles_user_friendly_name"),)
id: int | None = Field(default=None, primary_key=True)
user_id: int = Field(foreign_key="users.id", nullable=False, index=True)
friendly_name: str | None = Field(default=None, sa_column=Column(Text, nullable=True))
symbol: str = Field(sa_column=Column(Text, nullable=False))
exchange_id: int = Field(foreign_key="exchanges.id", nullable=False, index=True)
exchange: "Exchanges" = Relationship(back_populates="cycles")
underlying_currency: UnderlyingCurrency = Field(sa_column=Column(Text, nullable=False))
status: CycleStatus = Field(sa_column=Column(Text, nullable=False))
funding_source: FundingSource | None = Field(default=None, sa_column=Column(Text, nullable=True))
capital_exposure_cents: int | None = Field(default=None, nullable=True)
start_date: date = Field(sa_column=Column(Date, nullable=False))
end_date: date | None = Field(default=None, sa_column=Column(Date, nullable=True))
trades: list["Trades"] = Relationship(back_populates="cycle")
loan_amount_cents: int | None = Field(default=None, nullable=True)
loan_interest_rate_tenth_bps: int | None = Field(default=None, nullable=True)
latest_interest_accrued_date: date | None = Field(default=None, sa_column=Column(Date, nullable=True))
total_accrued_amount_cents: int = Field(default=0, sa_column=Column(Integer, nullable=False))
loan_change_events: list["CycleLoanChangeEvents"] = Relationship(back_populates="cycle")
daily_accruals: list["CycleDailyAccrual"] = Relationship(back_populates="cycle")
class CycleLoanChangeEvents(SQLModel, table=True):
__tablename__ = "cycle_loan_change_events" # type: ignore[attr-defined]
id: int | None = Field(default=None, primary_key=True)
cycle_id: int = Field(sa_column=Column(Integer, ForeignKey("cycles.id", ondelete="CASCADE"), nullable=False, index=True))
effective_date: date = Field(sa_column=Column(Date, nullable=False))
loan_amount_cents: int | None = Field(default=None, sa_column=Column(Integer, nullable=True))
loan_interest_rate_tenth_bps: int | None = Field(default=None, sa_column=Column(Integer, nullable=True))
related_trade_id: int | None = Field(default=None, sa_column=Column(Integer, ForeignKey("trades.id"), nullable=True, unique=True))
notes: str | None = Field(default=None, sa_column=Column(Text, nullable=True))
created_at: datetime = Field(sa_column=Column(DateTime(timezone=True), nullable=False))
cycle: "Cycles" = Relationship(back_populates="loan_change_events")
trade: Optional["Trades"] = Relationship(back_populates="related_loan_change_event")
class CycleDailyAccrual(SQLModel, table=True):
__tablename__ = "cycle_daily_accrual" # type: ignore[attr-defined]
__table_args__ = (UniqueConstraint("cycle_id", "accrual_date", name="uq_cycle_daily_accruals_cycle_date"),)
id: int | None = Field(default=None, primary_key=True)
cycle_id: int = Field(sa_column=Column(Integer, ForeignKey("cycles.id", ondelete="CASCADE"), nullable=False, index=True))
accrual_date: date = Field(sa_column=Column(Date, nullable=False))
accrual_amount_cents: int = Field(sa_column=Column(Integer, nullable=False))
created_at: datetime = Field(sa_column=Column(DateTime(timezone=True), nullable=False))
cycle: "Cycles" = Relationship(back_populates="daily_accruals")
class Exchanges(SQLModel, table=True):
__tablename__ = "exchanges" # type: ignore[attr-defined]
__table_args__ = (UniqueConstraint("user_id", "name", name="uq_exchanges_user_name"),)
id: int | None = Field(default=None, primary_key=True)
user_id: int = Field(foreign_key="users.id", nullable=False, index=True)
name: str = Field(sa_column=Column(Text, nullable=False))
notes: str | None = Field(default=None, sa_column=Column(Text, nullable=True))
trades: list["Trades"] = Relationship(back_populates="exchange")
cycles: list["Cycles"] = Relationship(back_populates="exchange")
user: "Users" = Relationship(back_populates="exchanges")
class Users(SQLModel, table=True):
__tablename__ = "users" # type: ignore[attr-defined]
id: int | None = Field(default=None, primary_key=True)
# unique=True already creates an index; no need to also set index=True
username: str = Field(sa_column=Column(Text, nullable=False, unique=True))
password_hash: str = Field(sa_column=Column(Text, nullable=False))
is_active: bool = Field(default=True, nullable=False)
sessions: list["Sessions"] = Relationship(back_populates="user")
exchanges: list["Exchanges"] = Relationship(back_populates="user")
class Sessions(SQLModel, table=True):
__tablename__ = "sessions" # type: ignore[attr-defined]
id: int | None = Field(default=None, primary_key=True)
user_id: int = Field(foreign_key="users.id", nullable=False, index=True)
session_token_hash: str = Field(sa_column=Column(Text, nullable=False, unique=True))
created_at: datetime = Field(sa_column=Column(DateTime(timezone=True), nullable=False))
expires_at: datetime = Field(sa_column=Column(DateTime(timezone=True), nullable=False, index=True))
last_seen_at: datetime | None = Field(default=None, sa_column=Column(DateTime(timezone=True), nullable=True))
last_used_ip: str | None = Field(default=None, sa_column=Column(Text, nullable=True))
user_agent: str | None = Field(default=None, sa_column=Column(Text, nullable=True))
device_name: str | None = Field(default=None, sa_column=Column(Text, nullable=True))
user: "Users" = Relationship(back_populates="sessions")

View File

@@ -0,0 +1,51 @@
import hashlib
import hmac
import secrets
from argon2 import PasswordHasher
from argon2.exceptions import VerifyMismatchError
import settings
ph = PasswordHasher()
# Utility functions for password hashing and verification
def hash_password(plain: str) -> str:
return ph.hash(plain)
def verify_password(plain: str, hashed: str) -> bool:
try:
return ph.verify(hashed, plain)
except VerifyMismatchError:
return False
# Session token hash
def generate_session_token(nbytes: int = 32) -> str:
return secrets.token_urlsafe(nbytes)
def hash_session_token_sha256(token: str) -> str:
return hashlib.sha256(token.encode("utf-8")).hexdigest()
def sign_token_hmac(token: str) -> str:
if not settings.settings.hmac_key:
return token
return hmac.new(settings.settings.hmac_key.encode("utf-8"), token.encode("utf-8"), hashlib.sha256).hexdigest()
def verify_token_sha256(token: str, expected_hash: str) -> bool:
return hmac.compare_digest(hash_session_token_sha256(token), expected_hash)
def verify_token_hmac(token: str, expected_hmac: str) -> bool:
if not settings.settings.hmac_key:
return verify_token_sha256(token, expected_hmac)
sig = hmac.new(settings.settings.hmac_key.encode("utf-8"), token.encode("utf-8"), hashlib.sha256).hexdigest()
return hmac.compare_digest(sig, expected_hmac)
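The helpers above reduce session auth to a digest comparison: only the SHA-256 of the token is stored, the digest is recomputed on each request, and the two are compared with `hmac.compare_digest`. A stdlib-only round-trip of that pattern (standalone re-implementation, not an import of the module above):

```python
import hashlib
import hmac
import secrets


def hash_token(token: str) -> str:
    return hashlib.sha256(token.encode("utf-8")).hexdigest()


def verify_token(token: str, expected_hash: str) -> bool:
    # constant-time comparison avoids leaking digest prefixes via timing
    return hmac.compare_digest(hash_token(token), expected_hash)


token = secrets.token_urlsafe(32)
stored = hash_token(token)  # only this digest would go in the sessions table
assert verify_token(token, stored)
assert not verify_token("wrong-token", stored)
```

Storing the digest rather than the token means a leaked sessions table cannot be replayed, since the plaintext token never touches the database.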

View File

@@ -0,0 +1,364 @@
from __future__ import annotations
import logging
from datetime import datetime, timedelta, timezone
from typing import TYPE_CHECKING, cast
from fastapi import Request, Response, status
from fastapi.responses import JSONResponse
from starlette.middleware.base import BaseHTTPMiddleware, RequestResponseEndpoint
import settings
from trading_journal import crud, security
from trading_journal.dto import (
CycleBase,
CycleCreate,
CycleRead,
CycleUpdate,
ExchangesBase,
ExchangesCreate,
ExchangesRead,
SessionsCreate,
SessionsUpdate,
TradeCreate,
TradeRead,
UserCreate,
UserLogin,
UserRead,
)
if TYPE_CHECKING:
from sqlmodel import Session
from trading_journal.db import Database
from trading_journal.models import Sessions
EXCEPT_PATHS = [
f"{settings.settings.api_base}/status",
f"{settings.settings.api_base}/register",
f"{settings.settings.api_base}/login",
]
logger = logging.getLogger(__name__)
class AuthMiddleWare(BaseHTTPMiddleware):
async def dispatch(self, request: Request, call_next: RequestResponseEndpoint) -> Response: # noqa: PLR0911
if request.url.path in EXCEPT_PATHS:
return await call_next(request)
token = request.cookies.get("session_token")
if not token:
auth_header = request.headers.get("Authorization")
if auth_header and auth_header.startswith("Bearer "):
token = auth_header[len("Bearer ") :]
if not token:
return JSONResponse(
status_code=status.HTTP_401_UNAUTHORIZED,
content={"detail": "Unauthorized"},
)
db_factory: Database | None = getattr(request.app.state, "db_factory", None)
if db_factory is None:
return JSONResponse(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, content={"detail": "db factory not configured"})
try:
with db_factory.get_session_ctx_manager() as request_session:
hashed_token = security.hash_session_token_sha256(token)
request.state.db_session = request_session
login_session: Sessions | None = crud.get_login_session_by_token_hash(request_session, hashed_token)
if not login_session:
return JSONResponse(status_code=status.HTTP_401_UNAUTHORIZED, content={"detail": "Unauthorized"})
session_expires_utc = login_session.expires_at.replace(tzinfo=timezone.utc)  # stored naive, interpreted as UTC
if session_expires_utc < datetime.now(timezone.utc):
crud.delete_login_session(request_session, login_session.session_token_hash)
return JSONResponse(status_code=status.HTTP_401_UNAUTHORIZED, content={"detail": "Unauthorized"})
if login_session.user.is_active is False:
return JSONResponse(status_code=status.HTTP_401_UNAUTHORIZED, content={"detail": "Unauthorized"})
if session_expires_utc - datetime.now(timezone.utc) < timedelta(seconds=3600):
updated_expiry = datetime.now(timezone.utc) + timedelta(seconds=settings.settings.session_expiry_seconds)
else:
updated_expiry = session_expires_utc
updated_session: SessionsUpdate = SessionsUpdate(
last_seen_at=datetime.now(timezone.utc),
last_used_ip=request.client.host if request.client else None,
user_agent=request.headers.get("User-Agent"),
expires_at=updated_expiry,
)
user_id = login_session.user_id
request.state.user_id = user_id
crud.update_login_session(request_session, hashed_token, update_session=updated_session)
except Exception:
logger.exception("Failed to authenticate user: \n")
return JSONResponse(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, content={"detail": "Internal server error"})
return await call_next(request)
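The middleware implements a sliding session: the expiry is only pushed out when less than an hour remains, so the session row is not rewritten with a new expiry on every request. That decision in isolation, with the 3600-second threshold used above (the expiry length is an assumed value; the real one comes from `settings.session_expiry_seconds`):

```python
from datetime import datetime, timedelta, timezone

SESSION_EXPIRY_SECONDS = 7 * 24 * 3600  # assumed value for illustration


def next_expiry(expires_at: datetime, now: datetime) -> datetime:
    # extend only when the session is inside its final hour
    if expires_at - now < timedelta(seconds=3600):
        return now + timedelta(seconds=SESSION_EXPIRY_SECONDS)
    return expires_at


now = datetime(2025, 10, 1, 12, 0, tzinfo=timezone.utc)
soon = now + timedelta(minutes=30)  # inside the final hour -> extended
later = now + timedelta(hours=5)    # plenty left -> unchanged
assert next_expiry(soon, now) == now + timedelta(seconds=SESSION_EXPIRY_SECONDS)
assert next_expiry(later, now) == later
```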
class ServiceError(Exception):
pass
class UserAlreadyExistsError(ServiceError):
pass
class ExchangeAlreadyExistsError(ServiceError):
pass
class ExchangeNotFoundError(ServiceError):
pass
class CycleNotFoundError(ServiceError):
pass
class TradeNotFoundError(ServiceError):
pass
class InvalidTradeDataError(ServiceError):
pass
class InvalidCycleDataError(ServiceError):
pass
# User service
def register_user_service(db_session: Session, user_in: UserCreate) -> UserRead:
if crud.get_user_by_username(db_session, user_in.username):
raise UserAlreadyExistsError("username already exists")
hashed = security.hash_password(user_in.password)
user_data: dict = {
"username": user_in.username,
"password_hash": hashed,
}
try:
user = crud.create_user(db_session, user_data=user_data)
try:
# pydantic v2: model_validate reads ORM objects when the DTO enables from_attributes
user = UserRead.model_validate(user)
except Exception as e:
logger.exception("Failed to convert user to UserRead: ")
raise ServiceError("Failed to convert user to UserRead") from e
except Exception as e:
logger.exception("Failed to create user:")
raise ServiceError("Failed to create user") from e
return user
def authenticate_user_service(db_session: Session, user_in: UserLogin) -> tuple[SessionsCreate, str] | None:
user = crud.get_user_by_username(db_session, user_in.username)
if not user:
return None
user_id_val = cast("int", user.id)
if not security.verify_password(user_in.password, user.password_hash):
return None
token = security.generate_session_token()
token_hashed = security.hash_session_token_sha256(token)
try:
session = crud.create_login_session(
session=db_session,
user_id=user_id_val,
session_token_hash=token_hashed,
session_length_seconds=settings.settings.session_expiry_seconds,
)
except Exception as e:
logger.exception("Failed to create login session: \n")
raise ServiceError("Failed to create login session") from e
return SessionsCreate.model_validate(session), token
# Exchanges service
def create_exchange_service(db_session: Session, user_id: int, name: str, notes: str | None) -> ExchangesCreate:
existing_exchange = crud.get_exchange_by_name_and_user_id(db_session, name, user_id)
if existing_exchange:
raise ExchangeAlreadyExistsError("Exchange with the same name already exists for this user")
exchange_data = ExchangesCreate(
user_id=user_id,
name=name,
notes=notes,
)
try:
exchange = crud.create_exchange(db_session, exchange_data=exchange_data)
try:
exchange_dto = ExchangesCreate.model_validate(exchange)
except Exception as e:
logger.exception("Failed to convert exchange to ExchangesCreate:")
raise ServiceError("Failed to convert exchange to ExchangesCreate") from e
except Exception as e:
logger.exception("Failed to create exchange:")
raise ServiceError("Failed to create exchange") from e
return exchange_dto
def get_exchanges_by_user_service(db_session: Session, user_id: int) -> list[ExchangesRead]:
exchanges = crud.get_all_exchanges_by_user_id(db_session, user_id)
return [ExchangesRead.model_validate(exchange) for exchange in exchanges]
def update_exchanges_service(db_session: Session, user_id: int, exchange_id: int, name: str | None, notes: str | None) -> ExchangesBase:
existing_exchange = crud.get_exchange_by_id(db_session, exchange_id)
if not existing_exchange:
raise ExchangeNotFoundError("Exchange not found")
if existing_exchange.user_id != user_id:
raise ExchangeNotFoundError("Exchange not found")
if name:
other_exchange = crud.get_exchange_by_name_and_user_id(db_session, name, user_id)
if other_exchange and other_exchange.id != existing_exchange.id:
raise ExchangeAlreadyExistsError("Another exchange with the same name already exists for this user")
exchange_data = ExchangesBase(
name=name or existing_exchange.name,
notes=notes or existing_exchange.notes,  # falsy notes keeps the existing value; clearing is not supported here
)
try:
exchange = crud.update_exchange(db_session, cast("int", existing_exchange.id), update_data=exchange_data)
except Exception as e:
logger.exception("Failed to update exchange: \n")
raise ServiceError("Failed to update exchange") from e
return ExchangesBase.model_validate(exchange)
# Cycle Service
def create_cycle_service(db_session: Session, user_id: int, cycle_data: CycleBase) -> CycleRead:
raise NotImplementedError("Cycle creation not implemented")  # the code below is unreachable until this is removed
cycle_data_dict = cycle_data.model_dump()
cycle_data_dict["user_id"] = user_id
cycle_data_with_user_id: CycleCreate = CycleCreate.model_validate(cycle_data_dict)
created_cycle = crud.create_cycle(db_session, cycle_data=cycle_data_with_user_id)
return CycleRead.model_validate(created_cycle)
def get_cycle_by_id_service(db_session: Session, user_id: int, cycle_id: int) -> CycleRead:
cycle = crud.get_cycle_by_id(db_session, cycle_id)
if not cycle:
raise CycleNotFoundError("Cycle not found")
if cycle.user_id != user_id:
raise CycleNotFoundError("Cycle not found")
return CycleRead.model_validate(cycle)
def get_cycles_by_user_service(db_session: Session, user_id: int) -> list[CycleRead]:
cycles = crud.get_cycles_by_user_id(db_session, user_id)
return [CycleRead.model_validate(cycle) for cycle in cycles]
def _validate_cycle_update_data(cycle_data: CycleUpdate) -> tuple[bool, str]: # noqa: PLR0911
if cycle_data.status == "CLOSED" and cycle_data.end_date is None:
return False, "end_date is required when status is CLOSED"
if cycle_data.status == "OPEN" and cycle_data.end_date is not None:
return False, "end_date must be empty when status is OPEN"
if cycle_data.capital_exposure_cents is not None and cycle_data.capital_exposure_cents < 0:
return False, "capital_exposure_cents must be non-negative"
if (
cycle_data.funding_source is not None
and cycle_data.funding_source != "CASH"
and (cycle_data.loan_amount_cents is None or cycle_data.loan_interest_rate_tenth_bps is None)
):
return False, "loan_amount_cents and loan_interest_rate_tenth_bps are required when funding_source is not CASH"
if cycle_data.loan_amount_cents is not None and cycle_data.loan_amount_cents < 0:
return False, "loan_amount_cents must be non-negative"
if cycle_data.loan_interest_rate_tenth_bps is not None and cycle_data.loan_interest_rate_tenth_bps < 0:
return False, "loan_interest_rate_tenth_bps must be non-negative"
return True, ""
def update_cycle_service(db_session: Session, user_id: int, cycle_data: CycleUpdate) -> CycleRead:
is_valid, err_msg = _validate_cycle_update_data(cycle_data)
if not is_valid:
raise InvalidCycleDataError(err_msg)
cycle_id = cast("int", cycle_data.id)
existing_cycle = crud.get_cycle_by_id(db_session, cycle_id)
if not existing_cycle:
raise CycleNotFoundError("Cycle not found")
if existing_cycle.user_id != user_id:
raise CycleNotFoundError("Cycle not found")
provided_data_dict = cycle_data.model_dump(exclude_unset=True)
cycle_data_with_user_id: CycleBase = CycleBase.model_validate(provided_data_dict)
try:
updated_cycle = crud.update_cycle(db_session, cycle_id, update_data=cycle_data_with_user_id)
except Exception as e:
logger.exception("Failed to update cycle: \n")
raise ServiceError("Failed to update cycle") from e
return CycleRead.model_validate(updated_cycle)
# Trades service
def _append_cashflows(trade_data: TradeCreate) -> TradeCreate:
sign_multiplier: int
if trade_data.trade_type in ("SELL_PUT", "SELL_CALL", "EXERCISE_CALL", "CLOSE_LONG_SPOT", "SHORT_SPOT"):
sign_multiplier = 1
else:
sign_multiplier = -1
quantity = trade_data.quantity * trade_data.quantity_multiplier
gross_cash_flow_cents = quantity * trade_data.price_cents * sign_multiplier
net_cash_flow_cents = gross_cash_flow_cents - trade_data.commission_cents
trade_data.gross_cash_flow_cents = gross_cash_flow_cents
trade_data.net_cash_flow_cents = net_cash_flow_cents
return trade_data
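`_append_cashflows` derives both cash-flow fields from quantity, multiplier, and price: cash-receiving trade types (premium collected, shares sold) get a positive sign, everything else negative, and commission is always subtracted from the gross. Worked through for a one-contract SELL_PUT with illustrative values:

```python
# SELL_PUT: sign +1 (premium is received)
quantity = 1
quantity_multiplier = 100   # e.g. 100 shares per option contract
price_cents = 250           # 2.50 premium per share
commission_cents = 65

effective_quantity = quantity * quantity_multiplier
gross_cash_flow_cents = effective_quantity * price_cents * 1    # 25_000
net_cash_flow_cents = gross_cash_flow_cents - commission_cents  # 24_935
assert gross_cash_flow_cents == 25_000
assert net_cash_flow_cents == 24_935
```

Keeping all amounts in integer cents, as the models do, sidesteps floating-point rounding in these products.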
def _validate_trade_data(trade_data: TradeCreate) -> bool:
return not (
trade_data.trade_type in ("SELL_PUT", "SELL_CALL") and (trade_data.expiry_date is None or trade_data.strike_price_cents is None)
)
def create_trade_service(db_session: Session, user_id: int, trade_data: TradeCreate) -> TradeRead:
if not _validate_trade_data(trade_data):
raise InvalidTradeDataError("Invalid trade data: expiry_date and strike_price_cents are required for SELL_PUT and SELL_CALL trades")
trade_data_dict = trade_data.model_dump()
trade_data_dict["user_id"] = user_id
trade_data_with_user_id: TradeCreate = TradeCreate.model_validate(trade_data_dict)
trade_data_with_user_id = _append_cashflows(trade_data_with_user_id)
created_trade = crud.create_trade(db_session, trade_data=trade_data_with_user_id)
return TradeRead.model_validate(created_trade)
def get_trade_by_id_service(db_session: Session, user_id: int, trade_id: int) -> TradeRead:
trade = crud.get_trade_by_id(db_session, trade_id)
if not trade:
raise TradeNotFoundError("Trade not found")
if trade.user_id != user_id:
raise TradeNotFoundError("Trade not found")
return TradeRead.model_validate(trade)
def update_trade_friendly_name_service(db_session: Session, user_id: int, trade_id: int, friendly_name: str) -> TradeRead:
existing_trade = crud.get_trade_by_id(db_session, trade_id)
if not existing_trade:
raise TradeNotFoundError("Trade not found")
if existing_trade.user_id != user_id:
raise TradeNotFoundError("Trade not found")
try:
updated_trade = crud.update_trade_friendly_name(db_session, trade_id, friendly_name)
except Exception as e:
logger.exception("Failed to update trade friendly name: \n")
raise ServiceError("Failed to update trade friendly name") from e
return TradeRead.model_validate(updated_trade)
def update_trade_note_service(db_session: Session, user_id: int, trade_id: int, note: str | None) -> TradeRead:
existing_trade = crud.get_trade_by_id(db_session, trade_id)
if not existing_trade:
raise TradeNotFoundError("Trade not found")
if existing_trade.user_id != user_id:
raise TradeNotFoundError("Trade not found")
if note is None:
note = ""
try:
updated_trade = crud.update_trade_note(db_session, trade_id, note)
except Exception as e:
logger.exception("Failed to update trade notes: \n")
raise ServiceError("Failed to update trade notes") from e
return TradeRead.model_validate(updated_trade)

View File

@@ -0,0 +1,7 @@
from sqlmodel import create_engine
import settings
from trading_journal import db_migration
db_engine = create_engine(settings.settings.database_url, echo=True)
db_migration.run_migrations(db_engine)