33 Commits

Author SHA1 Message Date
tliu93 636bb2b80b Merge pull request 'add get public and storage feature' (#6) from feature/public_ip into main
docker-image / build-and-push (push) Successful in 3m59s
pytest / test (push) Successful in 53s
Reviewed-on: #6
2026-04-29 13:16:58 +02:00
tliu93 eda49489e0 update reademe and docs
pytest / test (push) Successful in 56s
pytest / test (pull_request) Successful in 59s
2026-04-29 13:07:59 +02:00
tliu93 779e160b95 add ip change notification and refine sender display
pytest / test (push) Successful in 57s
pytest / test (pull_request) Successful in 54s
2026-04-29 13:03:12 +02:00
tliu93 3ea3498e58 add smtp module and testing 2026-04-29 12:11:10 +02:00
tliu93 5a420bd37b add get public and storage feature 2026-04-29 11:45:49 +02:00
tliu93 a24e402d47 add grafana provisioning
pytest / test (push) Successful in 46s
2026-04-23 00:12:51 +02:00
tliu93 8565534b73 Merge pull request 'fix ci test' (#5) from feature/add_separate_migration_container into main
pytest / test (push) Successful in 45s
docker-image / build-and-push (push) Successful in 3m40s
Reviewed-on: #5
2026-04-22 13:35:40 +02:00
tliu93 4acdd2dc60 fix ci test
pytest / test (push) Successful in 45s
pytest / test (pull_request) Successful in 44s
2026-04-22 13:31:26 +02:00
tliu93 c9af7530e5 Merge pull request 'change adoption to separate step' (#4) from feature/add_separate_migration_container into main
pytest / test (push) Failing after 44s
docker-image / build-and-push (push) Successful in 3m40s
Reviewed-on: #4
2026-04-22 13:28:30 +02:00
tliu93 a76d6bfb71 change adoption to separate step
pytest / test (push) Failing after 46s
pytest / test (pull_request) Failing after 45s
2026-04-22 13:28:00 +02:00
tliu93 35aee79d93 Restore legacy poo inbound dispatch
pytest / test (push) Successful in 43s
docker-image / build-and-push (push) Successful in 3m38s
2026-04-20 23:33:57 +02:00
tliu93 b9e7f51d51 Split compose dev build from registry deploy
pytest / test (push) Successful in 44s
2026-04-20 23:16:13 +02:00
tliu93 94747c75dd Align image publishing with repository path
pytest / test (push) Successful in 43s
docker-image / build-and-push (push) Successful in 3m37s
2026-04-20 23:05:27 +02:00
tliu93 7978a7e1e1 Add release Docker image workflow
pytest / test (push) Successful in 42s
docker-image / build-and-push (push) Successful in 3m42s
2026-04-20 22:18:54 +02:00
tliu93 e9e2034d30 Add Grafana to deployment compose
pytest / test (push) Successful in 40s
2026-04-20 20:50:46 +02:00
tliu93 aae8ca3b87 Merge pull request 'refactoring/new_python' (#3) from refactoring/new_python into main
pytest / test (push) Successful in 41s
Reviewed-on: #3
2026-04-20 20:41:01 +02:00
tliu93 1805d5d8ea Finalize first Python release
pytest / test (push) Successful in 40s
pytest / test (pull_request) Successful in 41s
2026-04-20 20:40:04 +02:00
tliu93 795c84f177 Stabilize auth tests in CI
pytest / test (push) Successful in 43s
2026-04-20 17:43:24 +02:00
tliu93 1ff426d2e9 Add pytest workflow
pytest / test (push) Failing after 2m18s
2026-04-20 17:38:32 +02:00
tliu93 fe0409dafe Refine runtime config and redirect settings 2026-04-20 17:36:05 +02:00
tliu93 982af62f4f Migrate TickTick OAuth and action tasks 2026-04-20 17:06:03 +02:00
tliu93 179aae264e Persist runtime config in app db and seed from env 2026-04-20 15:56:10 +02:00
tliu93 3f7c9e43d9 Switch auth password hashing to Argon2 2026-04-20 15:26:36 +02:00
tliu93 e1aad408ab Add auth foundation and app DB management 2026-04-20 15:16:47 +02:00
tliu93 044b47c573 Migrate poo recorder and align Alembic naming 2026-04-20 11:48:48 +02:00
tliu93 e334df992f Add Home Assistant inbound gateway 2026-04-20 10:42:35 +02:00
tliu93 151ad46275 Add Home Assistant outbound adapter 2026-04-20 10:11:02 +02:00
tliu93 eb487ccb46 Track exported OpenAPI schema 2026-04-19 23:25:13 +02:00
tliu93 d0dc8e893a Tighten location request validation 2026-04-19 23:18:20 +02:00
tliu93 1a2f9c75d9 Harden location db startup validation 2026-04-19 23:02:43 +02:00
tliu93 8aeb0723c1 Add location db adoption runbook 2026-04-19 21:57:31 +02:00
tliu93 32cc6847fd Migrate location recorder and refine db config 2026-04-19 21:39:23 +02:00
tliu93 31390882ef Bootstrap Python rewrite skeleton 2026-04-19 20:19:58 +02:00
137 changed files with 10721 additions and 2421 deletions
+10
View File
@@ -0,0 +1,10 @@
.git
.gitignore
.pytest_cache
.venv
__pycache__
*.pyc
data
openapi
src
+32
View File
@@ -0,0 +1,32 @@
# Required: bootstrap and core app settings.
# These values should be set before the container starts.
APP_NAME=Home Automation Backend (Python)
APP_ENV=production
APP_HOSTNAME=home-automation.example.com
APP_DATABASE_URL=sqlite:////app/data/app.db
LOCATION_DATABASE_URL=sqlite:////app/data/locationRecorder.db
POO_DATABASE_URL=sqlite:////app/data/pooRecorder.db
AUTH_BOOTSTRAP_USERNAME=admin
AUTH_BOOTSTRAP_PASSWORD=change-me
# Optional: runtime overrides.
# Leave these commented out to use the application's built-in defaults.
# APP_DEBUG=
# AUTH_SESSION_COOKIE_NAME=
# AUTH_SESSION_TTL_HOURS=
# AUTH_COOKIE_SECURE_OVERRIDE=
# Optional: Home Assistant integration.
# Leave these empty when Home Assistant integration is not needed.
HOME_ASSISTANT_BASE_URL=
HOME_ASSISTANT_AUTH_TOKEN=
POO_WEBHOOK_ID=
POO_SENSOR_ENTITY_NAME=
POO_SENSOR_FRIENDLY_NAME=
# Optional: TickTick integration.
# APP_HOSTNAME is used to derive the OAuth callback URI automatically.
TICKTICK_CLIENT_ID=
TICKTICK_CLIENT_SECRET=
TICKTICK_TOKEN=
HOME_ASSISTANT_ACTION_TASK_PROJECT_ID=
+43
View File
@@ -0,0 +1,43 @@
name: docker-image
on:
push:
tags:
- "v*"
env:
REGISTRY_HOST: code.wanderingbadger.dev
IMAGE_NAME: ${{ github.repository }}
jobs:
build-and-push:
runs-on: ubuntu-latest
steps:
- name: Check out repository
uses: actions/checkout@v4
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
with:
platforms: amd64,arm64
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Log in to Gitea Container Registry
uses: docker/login-action@v3
with:
registry: ${{ env.REGISTRY_HOST }}
username: ${{ secrets.REGISTRY_USERNAME }}
password: ${{ secrets.REGISTRY_TOKEN }}
- name: Build and push multi-arch image
uses: docker/build-push-action@v6
with:
context: .
platforms: linux/amd64,linux/arm64
push: true
tags: |
${{ env.REGISTRY_HOST }}/${{ env.IMAGE_NAME }}:${{ github.ref_name }}
${{ env.REGISTRY_HOST }}/${{ env.IMAGE_NAME }}:latest
-22
View File
@@ -1,22 +0,0 @@
name: Run nightly tests
on:
schedule:
- cron: '0 20 * * *' # Every day at 20:00 UTC
push:
branches:
- main
jobs:
nightly-tests:
runs-on: [ubuntu-latest, cloud]
steps:
- uses: actions/checkout@v4
- name: Set up Go
uses: actions/setup-go@v4
with:
go-version: '1.23'
- name: Test
working-directory: ./src
run: go test -v --short ./...
+31
View File
@@ -0,0 +1,31 @@
name: pytest
on:
push:
branches:
- "**"
pull_request:
workflow_dispatch:
jobs:
test:
runs-on: ubuntu-latest
steps:
- name: Check out repository
uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.13"
cache: pip
cache-dependency-path: |
requirements.txt
dev-requirements.txt
- name: Install dependencies
run: python -m pip install -r dev-requirements.txt
- name: Run pytest
run: python -m pytest
-21
View File
@@ -1,21 +0,0 @@
name: Run short tests
on:
push:
pull_request:
jobs:
run-tests:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Go
uses: actions/setup-go@v4
with:
go-version: '1.24'
- name: Run short tests with coverage
working-directory: ./src
run: | # TODO: at this moment only Home Assistant component is tested
go test -v --short ./components/homeassistant/... -cover -coverprofile=cover.out
+5 -35
View File
@@ -1,37 +1,7 @@
# If you prefer the allow list template instead of the deny list, see community template:
# https://github.com/github/gitignore/blob/main/community/Golang/Go.AllowList.gitignore
#
# Binaries for programs and plugins
*.exe
*.exe~
*.dll
*.so
*.dylib
# Test binary, built with `go test -c`
*.test
# Output of the go coverage tool, specifically when used with LiteIDE
*.out
# Dependency directories (remove the comment below to include it)
# vendor/
# Go workspace file
go.work
go.work.sum
# env file
.codex
.env
temp_data/
# py file for branch switching
.venv
__pycache__/
.pytest_cache/
config.yaml
bin/
*.db
cover.html
.venv/
__pycache__/
*.pyc
data/
+11 -26
View File
@@ -1,35 +1,20 @@
{
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"configurations": [
{
"name": "Launch Package",
"type": "go",
"name": "Launch Python App",
"type": "debugpy",
"request": "launch",
"mode": "auto",
"program": "${workspaceFolder}"
},
{
"name": "Launch Poo Reverse",
"type": "go",
"request": "launch",
"mode": "auto",
"program": "${workspaceFolder}/src/helper/poo_recorder_helper/main.go",
"module": "uvicorn",
"args": [
"reverse"
]
},
{
"name": "Launch Home Automation",
"type": "go",
"request": "launch",
"mode": "auto",
"program": "${workspaceFolder}/src/main.go",
"args": [
"serve"
]
"app.main:app",
"--reload",
"--host",
"0.0.0.0",
"--port",
"8000"
],
"jinja": true
}
]
}
+26
View File
@@ -0,0 +1,26 @@
FROM python:3.12-slim
ENV PYTHONDONTWRITEBYTECODE=1 \
PYTHONUNBUFFERED=1
WORKDIR /app
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
COPY app ./app
COPY alembic_app ./alembic_app
COPY alembic_app.ini ./
COPY alembic_location ./alembic_location
COPY alembic_location.ini ./
COPY alembic_poo ./alembic_poo
COPY alembic_poo.ini ./
COPY scripts ./scripts
COPY docker ./docker
COPY README.md ./
RUN mkdir -p /app/data
EXPOSE 8000
ENTRYPOINT ["/app/docker/entrypoint.sh"]
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
+452 -1
View File
@@ -1,3 +1,454 @@
# Home Automation Backend
![CI-Short](https://code.wanderingbadger.dev/tliu93/home-automation-backend.git/actions/workflows/short-tests.yml/badge.svg)
This is the first Python version of the current `home-automation` project.
The system currently includes:
- a FastAPI web application with server-rendered template pages
- a three-database layout on SQLite + SQLAlchemy + Alembic
- username/password + server-side session authentication
- a runtime config page persisted in the app DB
- a public IPv4 monitor with history persistence and scheduled checks
- SMTP configuration, test emails, and a public-IPv4-changed email notification
- a location recorder
- a poo recorder
- Home Assistant inbound / outbound integration
- TickTick OAuth and action task integration
- pytest tests and an OpenAPI export script
- Docker / Compose deployment entry points
Explicitly not included at this stage:
- the Notion module
## Current Configuration Status
The system still uses three separate SQLite database files rather than a single database:
- `app`-level shared data lives in its own DB file
- the `location` module has its own DB file
- the `poo` module has its own DB file
Merging these databases is explicitly out of scope for this rewrite. The configuration layer reflects this:
- `APP_DATABASE_URL`
- `LOCATION_DATABASE_URL`
- `POO_DATABASE_URL`
auth, `location`, and `poo` are each wired to their own independent database file.
The `app`-level shared DB is currently used mainly for:
- the single admin user
- server-side sessions
- runtime config persistence
- the current public IPv4 state and its change history
This part is also managed with Alembic:
- the `app` DB is not created automatically at application startup
- run `python scripts/app_db_adopt.py` first
- that script creates the new DB and sets up its schema
## Directory Layout
The main directories are:
- `app/`: FastAPI application code
- `alembic_app/`: Alembic migration environment for the app DB
- `alembic_location/`: Alembic migration environment for the location DB
- `alembic_poo/`: Alembic migration environment for the poo DB
- `tests/`: pytest tests
- `docs/`: documentation of the current system
- `scripts/`: helper scripts, such as the OpenAPI export
## Dependency Management
The project manages dependencies with `pip-tools`:
- production dependency source file: `requirements.in`
- development dependency source file: `dev-requirements.in`
- compiled outputs:
  - `requirements.txt`
  - `dev-requirements.txt`
To update dependencies:
```bash
python -m venv .venv
source .venv/bin/activate
pip install pip-tools
pip-compile requirements.in
pip-compile dev-requirements.in
```
To upgrade a specific dependency:
```bash
pip-compile --upgrade-package fastapi requirements.in
pip-compile dev-requirements.in
```
## Local Startup
Python 3.11 or later is recommended.
1. Create a virtual environment and install dependencies
```bash
python -m venv .venv
source .venv/bin/activate
pip install -r dev-requirements.txt
```
2. Prepare the environment variables
```bash
cp .env.example .env
```
3. Initialize the databases
```bash
python -m scripts.run_migrations
```
4. Start the service
```bash
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
```
Once running, the following are available:
- application home page: `http://localhost:8000/`
- health check: `http://localhost:8000/status`
- Swagger UI: `http://localhost:8000/docs`
- ReDoc: `http://localhost:8000/redoc`
## Databases and Alembic
SQLite is used by default, split across three database files:
- App DB: `sqlite:///./data/app.db`
- Location DB: `sqlite:///./data/locationRecorder.db`
- Poo DB: `sqlite:///./data/pooRecorder.db`
- data directory: `./data/`
With the migration environments initialized, you can keep adding models and generating migrations.
`app`, `location`, and `poo` each have their own independent Alembic chain:
- App Alembic environment: `alembic_app.ini` + `alembic_app/`
- Location Alembic environment: `alembic_location.ini` + `alembic_location/`
- Poo Alembic environment: `alembic_poo.ini` + `alembic_poo/`
- unified migration job: `python -m scripts.run_migrations`
- App DB initialization: `python scripts/app_db_adopt.py`
- Location DB adoption / initialization: `python scripts/location_db_adopt.py`
- Poo DB adoption / initialization: `python scripts/poo_db_adopt.py`
## Basic Authentication
The project provides a single-user admin authentication layer that protects the config pages and admin capabilities.
- credential model: `username/password`
- session model: server-side session + cookie
- main protected page: `/config`
- public page: `/login`
- public APIs: the existing business APIs have not yet been consolidated under auth in this round
Current security boundaries of the implementation:
- passwords are hashed with Argon2
- the session cookie is `HttpOnly`
- `Secure` follows `APP_ENV` by default: enabled whenever the environment is not development
- `SameSite=Lax`
- both the login and logout forms have basic CSRF protection
On first startup, if the auth DB at `APP_DATABASE_URL` contains no users yet, the application uses:
- `AUTH_BOOTSTRAP_USERNAME`
- `AUTH_BOOTSTRAP_PASSWORD`
to create the initial admin user. The current defaults are:
- username: `admin`
- password: `admin`
After the first login, the user is required to change the password immediately. This bootstrap only seeds the first user; it is not the full configuration management scheme going forward.
The frontend currently has two main page paths:
- `/login`
- `/config`
Whether accessed locally via `host:port` or through a reverse-proxied domain, a successful login redirects to `/config` using a relative path.
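The session model can be illustrated with a minimal sketch. This is hypothetical code, not the app's actual implementation; it only assumes that the 64-character `token_hash` column in `auth_sessions` stores a SHA-256 hex digest of the cookie token:

```python
import hashlib
import secrets


def new_session_token() -> tuple[str, str]:
    """Return (cookie_value, token_hash); only the hash is stored server-side."""
    token = secrets.token_urlsafe(32)  # random value placed in the HttpOnly cookie
    # SHA-256 hex digest is 64 characters, matching the token_hash column width
    token_hash = hashlib.sha256(token.encode()).hexdigest()
    return token, token_hash
```

Storing only the hash means a leaked database does not expose valid session cookies.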
## Config Persistence
The config page does not write changes back to `.env`.
The current principle:
- `.env` is only for bootstrap / fallback
- at startup, the app reads base configuration such as database URLs from `.env`
- at request time, configuration reads prefer `app_config` in the app DB
- if the database has no value for a key, fall back to `.env`
This means:
- the location / poo / app DB URLs remain bootstrap-level settings
- runtime-editable configuration is persisted mainly through the `app_config` table
- tokens / secrets that must be retrievable at runtime are currently allowed to be stored in plaintext in the config table
- the login password is still hashed separately with Argon2 and never stored in plaintext in the config table
Runtime configuration already wired into the config page includes:
- base system settings
- auth cookie settings
- base SMTP settings
- TickTick OAuth settings
- Home Assistant settings
The SMTP password behaves like the other secret fields:
- it is never echoed back in plaintext on the page
- submitting an empty value keeps the old one
- it is never written into responses when used for test emails or automatic notifications
## Public IPv4 Monitor
The system provides a minimal, usable public IPv4 monitor:
- checks the current public IPv4 via a single provider
- persists state and change history to the app DB
- exposes a protected manual check endpoint: `GET /public-ip/check`
- registers an APScheduler job at startup that checks every 4 hours by default
New app DB tables related to this feature:
- `public_ip_state`
- `public_ip_history`
Status semantics:
- `first_seen`: the current public IPv4 was observed for the first time
- `unchanged`: matches the previous state
- `changed`: the public IPv4 has changed
- `error`: the provider request failed or returned an invalid value
## SMTP and Email Notification
The system provides a minimal, usable SMTP capability:
- SMTP settings can be entered on the `/config` page and saved to `app_config`
- a test email can be sent from the config page
- the `From` header supports a display name, e.g. `Home Automation <sender@example.com>`
Current SMTP configuration keys:
- `SMTP_ENABLED`
- `SMTP_HOST`
- `SMTP_PORT`
- `SMTP_USERNAME`
- `SMTP_PASSWORD`
- `SMTP_FROM_NAME`
- `SMTP_FROM_ADDRESS`
- `SMTP_TO_ADDRESS`
- `SMTP_USE_STARTTLS`
The public IPv4 monitor is now wired to the SMTP sender, but it covers only one small notification scenario:
- when a public IPv4 check result is `changed`, a plain-text English email is sent automatically
No email is sent for:
- `first_seen`
- `unchanged`
- `error`
The notification email content is fixed; there is no template system. The body includes:
- previous IP
- current IP
- detected time
For manual testing, to simulate another IP change, temporarily set `public_ip_state.current_ipv4` to a reserved test address and call `GET /public-ip/check` again.
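A `From` header with a display name, as described above, can be built with the standard library `email` package. The following is a sketch under those assumptions, not the app's actual sender code:

```python
from email.message import EmailMessage
from email.utils import formataddr


def build_change_notification(from_name: str, from_addr: str, to_addr: str,
                              previous_ip: str, current_ip: str,
                              detected_at: str) -> EmailMessage:
    """Compose a plain-text change notification like the one described above."""
    msg = EmailMessage()
    msg["From"] = formataddr((from_name, from_addr))  # display name + address
    msg["To"] = to_addr
    msg["Subject"] = "Public IPv4 changed"
    msg.set_content(
        f"Previous IP: {previous_ip}\n"
        f"Current IP: {current_ip}\n"
        f"Detected at: {detected_at}\n"
    )
    return msg
```

The resulting message can then be handed to `smtplib.SMTP.send_message` after the usual STARTTLS / login steps.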
## OpenAPI
The current API definition can be re-exported with:
```bash
python scripts/export_openapi.py
```
The export writes to:
- `openapi/openapi.json`
- `openapi/openapi.yaml`
## Docker Compose
The default Compose service is named `app`, with the container name fixed to `home-automation-app`.
Compose is split into two layers:
- `docker-compose.yml`: uses the registry image by default, suitable for deployment / production pulls
- `docker-compose.override.yml`: adds `build: .` for local development only
Local development startup:
```bash
docker compose up -d --build
```
The command above automatically layers in `docker-compose.override.yml`, so local runs still rebuild from the current working directory.
To start the production way, pulling straight from the registry, explicitly use only the base compose file:
```bash
docker compose -f docker-compose.yml pull
docker compose -f docker-compose.yml up -d
```
To follow the logs:
```bash
docker compose logs -f app
```
## Grafana Provisioning
The repository supports automatically loading the SQLite datasources and the in-repo dashboard exports via Grafana provisioning.
The following file paths must be kept:
- `grafana/provisioning/datasources/locationrecorder.yaml`
- `grafana/provisioning/datasources/poorecorder.yaml`
- `grafana/provisioning/dashboards/provider.yaml`
- `grafana/dashboards/locationrecorder.json`
- `grafana/dashboards/poorecorder.json`
Their responsibilities:
- `grafana/provisioning/datasources/locationrecorder.yaml`: declares the `locationrecorder` SQLite datasource, pointing at `/data/home-automation/locationRecorder.db`
- `grafana/provisioning/datasources/poorecorder.yaml`: declares the `poorecorder` SQLite datasource, pointing at `/data/home-automation/pooRecorder.db`
- `grafana/provisioning/dashboards/provider.yaml`: tells Grafana to scan `/var/lib/grafana/dashboards` and load the dashboard JSON files found there
- `grafana/dashboards/locationrecorder.json`: the location recorder dashboard export; its content needs no rewriting in compose
- `grafana/dashboards/poorecorder.json`: the poo recorder dashboard export; its content needs no rewriting in compose
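For orientation, a SQLite datasource file of this kind might look roughly like the following. This is a sketch based on Grafana's provisioning format; the exact fields in the repo's files may differ:

```yaml
apiVersion: 1
datasources:
  - name: locationrecorder
    type: frser-sqlite-datasource   # SQLite plugin installed by the Grafana container
    access: proxy
    jsonData:
      path: /data/home-automation/locationRecorder.db
```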
In the current `docker-compose.yml`, the Grafana service mounts the following directories:
- `./grafana/provisioning -> /etc/grafana/provisioning:ro`
- `./grafana/dashboards -> /var/lib/grafana/dashboards:ro`
The existing named volume `homeautomation_grafana_storage:/var/lib/grafana` is kept as Grafana's runtime data store.
Before a one-command startup, at least the following files must exist:
- `grafana/provisioning/datasources/locationrecorder.yaml`
- `grafana/provisioning/datasources/poorecorder.yaml`
- `grafana/provisioning/dashboards/provider.yaml`
- `grafana/dashboards/locationrecorder.json`
- `grafana/dashboards/poorecorder.json`
Startup:
```bash
docker compose up -d
```
What happens after startup:
- the Grafana container installs the `frser-sqlite-datasource` plugin
- Grafana reads the datasource YAML files under `/etc/grafana/provisioning/datasources/`
- Grafana reads `/etc/grafana/provisioning/dashboards/provider.yaml`
- Grafana automatically imports the two dashboard JSON files from `/var/lib/grafana/dashboards/`
- the existing Grafana named volume keeps holding Grafana's runtime data and does not override the dashboards and provisioning files in the repo
## Container Image CI
The project ships a release image workflow:
- workflow file: `.github/workflows/docker-image.yml`
- trigger: pushing a tag matching `v*`, e.g. `v1.0.0`
- registry: `code.wanderingbadger.dev`
- image: `code.wanderingbadger.dev/<owner>/<repo>`
The app image used by default in `docker-compose.yml` for production is currently:
- `code.wanderingbadger.dev/tliu93/home-automation:latest`
The workflow no longer hardcodes the image name to a specific user package path; it derives the image path from the current repository identifier:
- `code.wanderingbadger.dev/${github.repository}:${tag}`
In Gitea, packages carry repo-ownership semantics, which shows up mainly in the image naming path itself rather than in an extra "binding" step. In other words, the current release scheme aligns repo/package semantics through the repository-path naming convention.
The workflow builds and pushes a multi-arch image:
- `linux/amd64`
- `linux/arm64`
Pushed tags:
- the release tag itself, e.g. `v1.0.0`
- `latest`
The workflow depends on these secrets:
- `REGISTRY_USERNAME`
- `REGISTRY_TOKEN`
The image produced by CI is meant to be pulled directly by the deployment machine with `docker pull`; the deployment machine does not need to check out this repository or run `docker build` locally.
## Running Tests
```bash
pytest
```
Current tests cover:
- basic app startup
- the `/status` endpoint
- the basic login / session flow
## OpenAPI Export
FastAPI exposes OpenAPI by default. To export static schema files, run:
```bash
python scripts/export_openapi.py
```
The output files are written to:
- `openapi/openapi.json`
- `openapi/openapi.yaml`
`openapi/` is under version control. When the API changes, re-run the export script and commit the regenerated schema files alongside the change.
## Container Startup
1. Prepare the environment variable file
```bash
cp .env.example .env
```
2. Start the container
```bash
docker compose up --build
```
Default port:
- `8000:8000`
SQLite persistence directories:
- `./data` on the host
- `/app/data` inside the container
+37
View File
@@ -0,0 +1,37 @@
[alembic]
script_location = alembic_app
prepend_sys_path = .
path_separator = os
sqlalchemy.url = sqlite:///./data/app.db
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARN
handlers = console
[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
+50
View File
@@ -0,0 +1,50 @@
from logging.config import fileConfig
from alembic import context
from sqlalchemy import engine_from_config, pool
from app.auth_db import AuthBase
from app.config import get_settings
from app.models.config import AppConfigEntry # noqa: F401
from app.models.auth import AuthSession, AuthUser # noqa: F401
from app.models.public_ip import PublicIPHistory, PublicIPState # noqa: F401
config = context.config
if config.config_file_name is not None:
fileConfig(config.config_file_name)
settings = get_settings()
configured_url = config.get_main_option("sqlalchemy.url")
if not configured_url or configured_url == "sqlite:///./data/app.db":
config.set_main_option("sqlalchemy.url", settings.app_database_url)
target_metadata = AuthBase.metadata
def run_migrations_offline() -> None:
url = config.get_main_option("sqlalchemy.url")
context.configure(url=url, target_metadata=target_metadata, literal_binds=True)
with context.begin_transaction():
context.run_migrations()
def run_migrations_online() -> None:
connectable = engine_from_config(
config.get_section(config.config_ini_section, {}),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
with connectable.connect() as connection:
context.configure(connection=connection, target_metadata=target_metadata)
with context.begin_transaction():
context.run_migrations()
if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online()
+25
View File
@@ -0,0 +1,25 @@
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
def upgrade() -> None:
${upgrades if upgrades else "pass"}
def downgrade() -> None:
${downgrades if downgrades else "pass"}
@@ -0,0 +1,56 @@
"""app auth baseline
Revision ID: 20260420_03_app_auth_baseline
Revises:
Create Date: 2026-04-20 00:00:00.000000
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
revision: str = "20260420_03_app_auth_baseline"
down_revision: Union[str, None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
op.create_table(
"auth_users",
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
sa.Column("username", sa.String(length=255), nullable=False),
sa.Column("password_hash", sa.String(length=255), nullable=False),
sa.Column("is_active", sa.Boolean(), nullable=False),
sa.Column("force_password_change", sa.Boolean(), nullable=False),
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(op.f("ix_auth_users_username"), "auth_users", ["username"], unique=True)
op.create_table(
"auth_sessions",
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
sa.Column("user_id", sa.Integer(), nullable=False),
sa.Column("token_hash", sa.String(length=64), nullable=False),
sa.Column("csrf_token", sa.String(length=128), nullable=False),
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False),
sa.Column("expires_at", sa.DateTime(timezone=True), nullable=False),
sa.Column("revoked_at", sa.DateTime(timezone=True), nullable=True),
sa.ForeignKeyConstraint(["user_id"], ["auth_users.id"]),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(op.f("ix_auth_sessions_expires_at"), "auth_sessions", ["expires_at"], unique=False)
op.create_index(op.f("ix_auth_sessions_token_hash"), "auth_sessions", ["token_hash"], unique=True)
op.create_index(op.f("ix_auth_sessions_user_id"), "auth_sessions", ["user_id"], unique=False)
def downgrade() -> None:
op.drop_index(op.f("ix_auth_sessions_user_id"), table_name="auth_sessions")
op.drop_index(op.f("ix_auth_sessions_token_hash"), table_name="auth_sessions")
op.drop_index(op.f("ix_auth_sessions_expires_at"), table_name="auth_sessions")
op.drop_table("auth_sessions")
op.drop_index(op.f("ix_auth_users_username"), table_name="auth_users")
op.drop_table("auth_users")
@@ -0,0 +1,34 @@
"""app config table
Revision ID: 20260420_04_app_config_table
Revises: 20260420_03_app_auth_baseline
Create Date: 2026-04-20 00:00:01.000000
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
revision: str = "20260420_04_app_config_table"
down_revision: Union[str, None] = "20260420_03_app_auth_baseline"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
op.create_table(
"app_config",
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
sa.Column("key", sa.String(length=255), nullable=False),
sa.Column("value", sa.String(), nullable=False),
sa.Column("updated_at", sa.DateTime(timezone=True), nullable=False),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(op.f("ix_app_config_key"), "app_config", ["key"], unique=True)
def downgrade() -> None:
op.drop_index(op.f("ix_app_config_key"), table_name="app_config")
op.drop_table("app_config")
@@ -0,0 +1,55 @@
"""public ip monitor tables
Revision ID: 20260429_05_public_ip_monitor
Revises: 20260420_04_app_config_table
Create Date: 2026-04-29 00:00:01.000000
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
revision: str = "20260429_05_public_ip_monitor"
down_revision: Union[str, None] = "20260420_04_app_config_table"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
op.create_table(
"public_ip_history",
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
sa.Column("ipv4", sa.String(length=45), nullable=False),
sa.Column("observed_at", sa.DateTime(timezone=True), nullable=False),
sa.Column("change_type", sa.String(length=32), nullable=False),
sa.Column("provider", sa.String(length=64), nullable=True),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
"ix_public_ip_history_observed_at",
"public_ip_history",
["observed_at"],
unique=False,
)
op.create_table(
"public_ip_state",
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
sa.Column("current_ipv4", sa.String(length=45), nullable=False),
sa.Column("previous_ipv4", sa.String(length=45), nullable=True),
sa.Column("first_seen_at", sa.DateTime(timezone=True), nullable=False),
sa.Column("last_checked_at", sa.DateTime(timezone=True), nullable=False),
sa.Column("last_changed_at", sa.DateTime(timezone=True), nullable=True),
sa.Column("last_check_status", sa.String(length=32), nullable=False),
sa.Column("last_check_error", sa.String(length=255), nullable=True),
sa.Column("last_provider", sa.String(length=64), nullable=True),
sa.PrimaryKeyConstraint("id"),
)
def downgrade() -> None:
op.drop_table("public_ip_state")
op.drop_index("ix_public_ip_history_observed_at", table_name="public_ip_history")
op.drop_table("public_ip_history")
+37
View File
@@ -0,0 +1,37 @@
[alembic]
script_location = alembic_location
prepend_sys_path = .
path_separator = os
sqlalchemy.url = sqlite:///./data/locationRecorder.db
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARN
handlers = console
[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
+2
View File
@@ -0,0 +1,2 @@
This directory contains the Alembic migration environment for the Python rewrite skeleton.
+48
View File
@@ -0,0 +1,48 @@
from logging.config import fileConfig
from alembic import context
from sqlalchemy import engine_from_config, pool
from app.config import get_settings
from app.models import Location # noqa: F401
from app.models.base import Base
config = context.config
if config.config_file_name is not None:
fileConfig(config.config_file_name)
settings = get_settings()
configured_url = config.get_main_option("sqlalchemy.url")
if not configured_url or configured_url == "sqlite:///./data/locationRecorder.db":
config.set_main_option("sqlalchemy.url", settings.location_database_url)
target_metadata = Base.metadata
def run_migrations_offline() -> None:
url = config.get_main_option("sqlalchemy.url")
context.configure(url=url, target_metadata=target_metadata, literal_binds=True)
with context.begin_transaction():
context.run_migrations()
def run_migrations_online() -> None:
connectable = engine_from_config(
config.get_section(config.config_ini_section, {}),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
with connectable.connect() as connection:
context.configure(connection=connection, target_metadata=target_metadata)
with context.begin_transaction():
context.run_migrations()
if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online()
+26
View File
@@ -0,0 +1,26 @@
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
def upgrade() -> None:
${upgrades if upgrades else "pass"}
def downgrade() -> None:
${downgrades if downgrades else "pass"}
+1
View File
@@ -0,0 +1 @@
@@ -0,0 +1,33 @@
"""location baseline
Revision ID: 20260419_01_location_baseline
Revises:
Create Date: 2026-04-19 00:00:00.000000
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
revision: str = "20260419_01_location_baseline"
down_revision: Union[str, None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
op.create_table(
"location",
sa.Column("person", sa.Text(), nullable=False),
sa.Column("datetime", sa.Text(), nullable=False),
sa.Column("latitude", sa.Float(), nullable=False),
sa.Column("longitude", sa.Float(), nullable=False),
sa.Column("altitude", sa.Float(), nullable=True),
sa.PrimaryKeyConstraint("person", "datetime"),
)
def downgrade() -> None:
op.drop_table("location")
+37
View File
@@ -0,0 +1,37 @@
[alembic]
script_location = alembic_poo
prepend_sys_path = .
path_separator = os
sqlalchemy.url = sqlite:///./data/pooRecorder.db
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARN
handlers = console
[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
+48
View File
@@ -0,0 +1,48 @@
from logging.config import fileConfig
from alembic import context
from sqlalchemy import engine_from_config, pool
from app.config import get_settings
from app.models.poo import PooRecord # noqa: F401
from app.poo_db import PooBase
config = context.config
if config.config_file_name is not None:
fileConfig(config.config_file_name)
settings = get_settings()
configured_url = config.get_main_option("sqlalchemy.url")
if not configured_url or configured_url == "sqlite:///./data/pooRecorder.db":
config.set_main_option("sqlalchemy.url", settings.poo_database_url)
target_metadata = PooBase.metadata
def run_migrations_offline() -> None:
url = config.get_main_option("sqlalchemy.url")
context.configure(url=url, target_metadata=target_metadata, literal_binds=True)
with context.begin_transaction():
context.run_migrations()
def run_migrations_online() -> None:
connectable = engine_from_config(
config.get_section(config.config_ini_section, {}),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
with connectable.connect() as connection:
context.configure(connection=connection, target_metadata=target_metadata)
with context.begin_transaction():
context.run_migrations()
if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online()
@@ -0,0 +1,32 @@
"""poo baseline
Revision ID: 20260420_01_poo_baseline
Revises:
Create Date: 2026-04-20 00:00:00.000000
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
revision: str = "20260420_01_poo_baseline"
down_revision: Union[str, None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
op.create_table(
"poo_records",
sa.Column("timestamp", sa.Text(), nullable=False),
sa.Column("status", sa.Text(), nullable=False),
sa.Column("latitude", sa.Float(), nullable=False),
sa.Column("longitude", sa.Float(), nullable=False),
sa.PrimaryKeyConstraint("timestamp"),
)
def downgrade() -> None:
op.drop_table("poo_records")
@@ -0,0 +1,2 @@
"""Application package for the home automation backend."""
@@ -0,0 +1,2 @@
"""API package."""
@@ -0,0 +1,2 @@
"""Route modules."""
@@ -0,0 +1,234 @@
import logging
from pathlib import Path
from fastapi import APIRouter, Depends, Form, Request, status
from fastapi.responses import HTMLResponse, RedirectResponse, Response
from fastapi.templating import Jinja2Templates
from sqlalchemy.orm import Session
from app.config import Settings
from app.dependencies import get_app_settings, get_auth_db, get_current_auth_session
from app.services.auth import (
AuthenticatedSession,
authenticate_user,
change_password,
create_session,
AuthPasswordChangeError,
issue_login_csrf_token,
revoke_session,
validate_csrf_token,
)
from app.services.config_page import build_config_sections, is_ticktick_oauth_ready
logger = logging.getLogger(__name__)
templates = Jinja2Templates(directory=str(Path(__file__).resolve().parents[2] / "templates"))
router = APIRouter(tags=["auth"])
LOGIN_CSRF_COOKIE_NAME = "login_csrf"
@router.get("/login", response_class=HTMLResponse)
def login_page(
request: Request,
settings: Settings = Depends(get_app_settings),
current_auth: AuthenticatedSession | None = Depends(get_current_auth_session),
) -> Response:
if current_auth is not None:
return RedirectResponse(url="/config", status_code=status.HTTP_303_SEE_OTHER)
csrf_token = issue_login_csrf_token()
response = templates.TemplateResponse(
request,
"login.html",
{
"app_name": settings.app_name,
"app_env": settings.app_env,
"csrf_token": csrf_token,
"error_message": None,
},
)
_set_login_csrf_cookie(response, settings=settings, token=csrf_token)
return response
@router.post("/login", response_class=HTMLResponse)
def login_submit(
request: Request,
username: str = Form(),
password: str = Form(),
csrf_token: str = Form(),
session: Session = Depends(get_auth_db),
settings: Settings = Depends(get_app_settings),
) -> Response:
cookie_csrf_token = request.cookies.get(LOGIN_CSRF_COOKIE_NAME)
if not validate_csrf_token(expected=cookie_csrf_token, actual=csrf_token):
logger.warning("Rejected login attempt due to CSRF validation failure")
return _render_login_error(
request,
settings=settings,
status_code=status.HTTP_400_BAD_REQUEST,
error_message="invalid login request",
)
user = authenticate_user(session, username=username, password=password)
if user is None:
return _render_login_error(
request,
settings=settings,
status_code=status.HTTP_401_UNAUTHORIZED,
error_message="invalid username or password",
)
auth_session, raw_token = create_session(session, user=user, settings=settings)
response = RedirectResponse(url="/config", status_code=status.HTTP_303_SEE_OTHER)
response.delete_cookie(LOGIN_CSRF_COOKIE_NAME, path="/login")
response.set_cookie(
key=settings.auth_session_cookie_name,
value=raw_token,
max_age=settings.auth_session_ttl_hours * 3600,
httponly=True,
secure=settings.auth_cookie_secure,
samesite="lax",
path="/",
)
logger.info("Created authenticated session for user '%s'", user.username)
return response
@router.post("/config/change-password", response_class=HTMLResponse)
def change_password_submit(
request: Request,
current_password: str = Form(),
new_password: str = Form(),
confirm_password: str = Form(),
csrf_token: str = Form(),
session: Session = Depends(get_auth_db),
settings: Settings = Depends(get_app_settings),
current_auth: AuthenticatedSession | None = Depends(get_current_auth_session),
) -> Response:
if current_auth is None:
return RedirectResponse(url="/login", status_code=status.HTTP_303_SEE_OTHER)
if not validate_csrf_token(expected=current_auth.session.csrf_token, actual=csrf_token):
logger.warning("Rejected password change attempt due to CSRF validation failure")
return _render_config_page(
request,
settings=settings,
auth_db_session=session,
current_auth=current_auth,
status_code=status.HTTP_400_BAD_REQUEST,
password_change_error="invalid password change request",
)
try:
change_password(
session,
user=current_auth.user,
current_password=current_password,
new_password=new_password,
confirm_password=confirm_password,
)
except AuthPasswordChangeError as exc:
logger.info(
"Rejected password change for user '%s': %s",
current_auth.user.username,
exc,
)
return _render_config_page(
request,
settings=settings,
auth_db_session=session,
current_auth=current_auth,
status_code=status.HTTP_400_BAD_REQUEST,
password_change_error="password change failed",
)
logger.info("Password updated for user '%s'", current_auth.user.username)
return RedirectResponse(url="/config", status_code=status.HTTP_303_SEE_OTHER)
@router.post("/logout")
def logout(
request: Request,
csrf_token: str = Form(),
session: Session = Depends(get_auth_db),
settings: Settings = Depends(get_app_settings),
current_auth: AuthenticatedSession | None = Depends(get_current_auth_session),
) -> RedirectResponse:
if current_auth is not None and validate_csrf_token(
expected=current_auth.session.csrf_token, actual=csrf_token
):
revoke_session(session, auth_session=current_auth.session)
logger.info("Revoked authenticated session for user '%s'", current_auth.user.username)
else:
logger.warning("Rejected logout request due to missing session or invalid CSRF token")
response = RedirectResponse(url="/login", status_code=status.HTTP_303_SEE_OTHER)
response.delete_cookie(settings.auth_session_cookie_name, path="/")
return response
def _render_login_error(
request: Request,
*,
settings: Settings,
status_code: int,
error_message: str,
) -> HTMLResponse:
csrf_token = issue_login_csrf_token()
response = templates.TemplateResponse(
request,
"login.html",
{
"app_name": settings.app_name,
"app_env": settings.app_env,
"csrf_token": csrf_token,
"error_message": error_message,
},
status_code=status_code,
)
_set_login_csrf_cookie(response, settings=settings, token=csrf_token)
return response
def _set_login_csrf_cookie(response: HTMLResponse, *, settings: Settings, token: str) -> None:
response.set_cookie(
key=LOGIN_CSRF_COOKIE_NAME,
value=token,
max_age=1800,
httponly=True,
secure=settings.auth_cookie_secure,
samesite="lax",
path="/login",
)
def _render_config_page(
request: Request,
*,
settings: Settings,
auth_db_session: Session,
current_auth: AuthenticatedSession,
status_code: int,
password_change_error: str | None,
) -> HTMLResponse:
return templates.TemplateResponse(
request,
"config.html",
{
"app_name": settings.app_name,
"app_env": settings.app_env,
"current_username": current_auth.user.username,
"csrf_token": current_auth.session.csrf_token,
"force_password_change": current_auth.user.force_password_change,
"password_change_error": password_change_error,
"config_error": None,
"config_saved": False,
"config_sections": build_config_sections(auth_db_session, settings),
"ticktick_oauth_ready": is_ticktick_oauth_ready(settings),
"ticktick_redirect_uri": settings.ticktick_redirect_uri,
"ticktick_oauth_notice": None,
"ticktick_oauth_error": None,
},
status_code=status_code,
)
@@ -0,0 +1,86 @@
import json
import logging
from fastapi import APIRouter, Depends, Request, status
from fastapi.responses import PlainTextResponse, Response
from pydantic import ValidationError
from sqlalchemy.orm import Session
from app.config import Settings
from app.dependencies import (
get_app_settings,
get_db,
get_homeassistant_client,
get_poo_db,
get_ticktick_client,
)
from app.integrations.homeassistant import (
HomeAssistantClient,
HomeAssistantConfigError,
HomeAssistantRequestError,
)
from app.integrations.ticktick import TickTickClient, TickTickConfigError, TickTickRequestError
from app.schemas.homeassistant import HomeAssistantPublishEnvelope
from app.services.homeassistant_inbound import (
UnsupportedHomeAssistantMessage,
handle_homeassistant_message,
)
router = APIRouter(tags=["homeassistant"])
logger = logging.getLogger(__name__)
BAD_REQUEST_MESSAGE = "bad request"
INTERNAL_SERVER_ERROR_MESSAGE = "internal server error"
@router.post("/homeassistant/publish")
async def publish_from_homeassistant(
request: Request,
db: Session = Depends(get_db),
poo_db: Session = Depends(get_poo_db),
settings: Settings = Depends(get_app_settings),
homeassistant_client: HomeAssistantClient = Depends(get_homeassistant_client),
ticktick_client: TickTickClient = Depends(get_ticktick_client),
) -> Response:
try:
raw_payload = await request.body()
data = json.loads(raw_payload)
envelope = HomeAssistantPublishEnvelope.model_validate(data)
handle_homeassistant_message(
db,
envelope,
ticktick_client=ticktick_client,
poo_session=poo_db,
settings=settings,
homeassistant_client=homeassistant_client,
)
except json.JSONDecodeError as exc:
logger.warning("Rejected Home Assistant publish request due to invalid JSON: %s", exc)
return PlainTextResponse(BAD_REQUEST_MESSAGE, status_code=status.HTTP_400_BAD_REQUEST)
except ValidationError as exc:
logger.warning(
"Rejected Home Assistant publish request due to validation failure: %s", exc
)
return PlainTextResponse(BAD_REQUEST_MESSAGE, status_code=status.HTTP_400_BAD_REQUEST)
except UnsupportedHomeAssistantMessage as exc:
logger.warning("Home Assistant publish target/action unsupported: %s", exc)
return PlainTextResponse(
INTERNAL_SERVER_ERROR_MESSAGE,
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
except (
TickTickConfigError,
TickTickRequestError,
HomeAssistantConfigError,
HomeAssistantRequestError,
RuntimeError,
) as exc:
logger.warning("Home Assistant publish request failed during integration handling: %s", exc)
return PlainTextResponse(
INTERNAL_SERVER_ERROR_MESSAGE,
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
except ValueError as exc:
logger.warning("Rejected Home Assistant publish request due to invalid content: %s", exc)
return PlainTextResponse(BAD_REQUEST_MESSAGE, status_code=status.HTTP_400_BAD_REQUEST)
return Response(status_code=status.HTTP_200_OK)
@@ -0,0 +1,35 @@
import json
import logging
from fastapi import APIRouter, Depends, Request, status
from fastapi.responses import PlainTextResponse, Response
from pydantic import ValidationError
from sqlalchemy.orm import Session
from app.dependencies import get_db
from app.schemas.location import LocationRecordRequest
from app.services.location import record_location
router = APIRouter(tags=["location"])
logger = logging.getLogger(__name__)
BAD_REQUEST_MESSAGE = "bad request"
@router.post("/location/record")
async def create_location_record(request: Request, db: Session = Depends(get_db)) -> Response:
try:
raw_payload = await request.body()
data = json.loads(raw_payload)
payload = LocationRecordRequest.model_validate(data)
record_location(db, payload)
except json.JSONDecodeError as exc:
logger.warning("Rejected location request due to invalid JSON: %s", exc)
return PlainTextResponse(BAD_REQUEST_MESSAGE, status_code=status.HTTP_400_BAD_REQUEST)
except ValidationError as exc:
logger.warning("Rejected location request due to payload validation failure: %s", exc)
return PlainTextResponse(BAD_REQUEST_MESSAGE, status_code=status.HTTP_400_BAD_REQUEST)
except ValueError as exc:
logger.warning("Rejected location request due to invalid numeric input: %s", exc)
return PlainTextResponse(BAD_REQUEST_MESSAGE, status_code=status.HTTP_400_BAD_REQUEST)
return Response(status_code=status.HTTP_200_OK)
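The inbound routes share one strict-parse pattern: read the raw body, decode JSON, validate, and map every expected failure class to a plain-text 400. The shape of that mapping can be sketched without FastAPI or pydantic (`parse_location` and `handle` are toy stand-ins, not project code):

```python
import json

def parse_location(raw: bytes) -> dict:
    # Toy stand-in for json.loads + model validation.
    data = json.loads(raw)
    for key in ("latitude", "longitude"):
        if key not in data:
            raise ValueError(f"missing field: {key}")
        data[key] = float(data[key])
    return data

def handle(raw: bytes) -> int:
    # JSONDecodeError is a ValueError subclass, so one tuple
    # covers decode, validation, and numeric-conversion failures.
    try:
        parse_location(raw)
    except (json.JSONDecodeError, ValueError):
        return 400
    return 200
```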
@@ -0,0 +1,240 @@
import logging
from pathlib import Path
from fastapi import APIRouter, Depends, Request, status
from fastapi.responses import HTMLResponse, RedirectResponse, Response
from fastapi.templating import Jinja2Templates
from app.config import Settings, get_settings
from app.dependencies import get_app_settings, get_auth_db, get_current_auth_session
from app.services.auth import AuthenticatedSession
from app.services.config_page import (
ConfigSaveError,
build_config_sections,
is_ticktick_oauth_ready,
save_config_updates,
)
from app.services.email import (
    EmailConfigurationError,
    EmailDeliveryError,
    is_smtp_ready,
    send_smtp_test_email,
)
from sqlalchemy.orm import Session
templates = Jinja2Templates(directory=str(Path(__file__).resolve().parents[2] / "templates"))
router = APIRouter(tags=["pages"])
logger = logging.getLogger(__name__)
def _ticktick_oauth_notice(status_value: str | None) -> tuple[str | None, str | None]:
if status_value == "success":
return "TickTick authorization completed successfully.", None
if status_value == "invalid-state":
return None, "TickTick authorization failed due to invalid OAuth state. Start the flow again."
if status_value == "invalid-callback":
return None, "TickTick authorization callback was missing required parameters."
if status_value == "failed":
return None, "TickTick authorization failed. Check server logs for the provider response and verify TickTick app credentials and redirect URI."
return None, None
def _smtp_test_notice(status_value: str | None) -> tuple[str | None, str | None]:
if status_value == "success":
return "SMTP test email sent successfully.", None
if status_value == "config-error":
return None, "SMTP test failed. Check required SMTP settings before sending a test email."
if status_value == "failed":
return None, "SMTP test failed. Check saved SMTP settings and server reachability."
return None, None
def _build_config_context(
*,
auth_db_session: Session,
settings: Settings,
current_auth: AuthenticatedSession,
config_saved: bool,
config_error: str | None,
password_change_error: str | None,
ticktick_oauth_notice: str | None,
ticktick_oauth_error: str | None,
smtp_test_notice: str | None,
smtp_test_error: str | None,
) -> dict[str, object]:
return {
"app_name": settings.app_name,
"app_env": settings.app_env,
"current_username": current_auth.user.username,
"csrf_token": current_auth.session.csrf_token,
"force_password_change": current_auth.user.force_password_change,
"password_change_error": password_change_error,
"config_error": config_error,
"config_saved": config_saved,
"config_sections": build_config_sections(auth_db_session, settings),
"ticktick_oauth_ready": is_ticktick_oauth_ready(settings),
"ticktick_redirect_uri": settings.ticktick_redirect_uri,
"ticktick_oauth_notice": ticktick_oauth_notice,
"ticktick_oauth_error": ticktick_oauth_error,
"smtp_test_ready": is_smtp_ready(settings),
"smtp_test_notice": smtp_test_notice,
"smtp_test_error": smtp_test_error,
}
@router.get("/", response_class=HTMLResponse)
def home(
request: Request,
current_auth: AuthenticatedSession | None = Depends(get_current_auth_session),
) -> RedirectResponse:
if current_auth is None:
return RedirectResponse(url="/login", status_code=status.HTTP_303_SEE_OTHER)
return RedirectResponse(url="/config", status_code=status.HTTP_303_SEE_OTHER)
@router.get("/admin", response_class=HTMLResponse)
def admin_redirect(
request: Request,
current_auth: AuthenticatedSession | None = Depends(get_current_auth_session),
) -> RedirectResponse:
if current_auth is None:
return RedirectResponse(url="/login", status_code=status.HTTP_303_SEE_OTHER)
return RedirectResponse(url="/config", status_code=status.HTTP_303_SEE_OTHER)
@router.get("/config", response_class=HTMLResponse)
def config_page(
request: Request,
auth_db_session: Session = Depends(get_auth_db),
settings: Settings = Depends(get_app_settings),
current_auth: AuthenticatedSession | None = Depends(get_current_auth_session),
) -> Response:
if current_auth is None:
return RedirectResponse(url="/login", status_code=status.HTTP_303_SEE_OTHER)
ticktick_oauth_notice, ticktick_oauth_error = _ticktick_oauth_notice(
request.query_params.get("ticktick_oauth")
)
smtp_test_notice, smtp_test_error = _smtp_test_notice(request.query_params.get("smtp_test"))
context = _build_config_context(
auth_db_session=auth_db_session,
settings=settings,
current_auth=current_auth,
config_saved=request.query_params.get("saved") == "1",
config_error=None,
password_change_error=None,
ticktick_oauth_notice=ticktick_oauth_notice,
ticktick_oauth_error=ticktick_oauth_error,
smtp_test_notice=smtp_test_notice,
smtp_test_error=smtp_test_error,
)
return templates.TemplateResponse(request, "config.html", context)
@router.post("/config", response_class=HTMLResponse)
async def config_submit(
request: Request,
auth_db_session: Session = Depends(get_auth_db),
settings: Settings = Depends(get_app_settings),
current_auth: AuthenticatedSession | None = Depends(get_current_auth_session),
) -> Response:
if current_auth is None:
return RedirectResponse(url="/login", status_code=status.HTTP_303_SEE_OTHER)
form = await request.form()
csrf_token = form.get("csrf_token")
if csrf_token != current_auth.session.csrf_token:
logger.warning("Rejected config update due to CSRF validation failure")
context = _build_config_context(
auth_db_session=auth_db_session,
settings=settings,
current_auth=current_auth,
config_saved=False,
config_error="invalid config update request",
password_change_error=None,
ticktick_oauth_notice=None,
ticktick_oauth_error=None,
smtp_test_notice=None,
smtp_test_error=None,
)
return templates.TemplateResponse(
request,
"config.html",
context,
status_code=status.HTTP_400_BAD_REQUEST,
)
try:
save_config_updates(auth_db_session, dict(form), settings)
except ConfigSaveError:
logger.warning("Rejected config update due to invalid submitted values")
refreshed_settings = get_settings()
context = _build_config_context(
auth_db_session=auth_db_session,
settings=refreshed_settings,
current_auth=current_auth,
config_saved=False,
config_error="invalid config submission",
password_change_error=None,
ticktick_oauth_notice=None,
ticktick_oauth_error=None,
smtp_test_notice=None,
smtp_test_error=None,
)
return templates.TemplateResponse(
request,
"config.html",
context,
status_code=status.HTTP_400_BAD_REQUEST,
)
return RedirectResponse(url="/config?saved=1", status_code=status.HTTP_303_SEE_OTHER)
@router.post("/config/smtp/test", response_class=HTMLResponse)
async def smtp_test_submit(
request: Request,
auth_db_session: Session = Depends(get_auth_db),
settings: Settings = Depends(get_app_settings),
current_auth: AuthenticatedSession | None = Depends(get_current_auth_session),
) -> Response:
if current_auth is None:
return RedirectResponse(url="/login", status_code=status.HTTP_303_SEE_OTHER)
form = await request.form()
csrf_token = form.get("csrf_token")
if csrf_token != current_auth.session.csrf_token:
logger.warning("Rejected SMTP test due to CSRF validation failure")
context = _build_config_context(
auth_db_session=auth_db_session,
settings=settings,
current_auth=current_auth,
config_saved=False,
config_error=None,
password_change_error=None,
ticktick_oauth_notice=None,
ticktick_oauth_error=None,
smtp_test_notice=None,
smtp_test_error="invalid SMTP test request",
)
return templates.TemplateResponse(
request,
"config.html",
context,
status_code=status.HTTP_400_BAD_REQUEST,
)
try:
send_smtp_test_email(settings)
except EmailConfigurationError as exc:
logger.warning("SMTP test email rejected due to configuration: %s", exc)
return RedirectResponse(
url="/config?smtp_test=config-error",
status_code=status.HTTP_303_SEE_OTHER,
)
except EmailDeliveryError as exc:
logger.warning("SMTP test email failed: %s", exc)
return RedirectResponse(
url="/config?smtp_test=failed",
status_code=status.HTTP_303_SEE_OTHER,
)
return RedirectResponse(
url="/config?smtp_test=success",
status_code=status.HTTP_303_SEE_OTHER,
)
@@ -0,0 +1,76 @@
import json
import logging
from fastapi import APIRouter, Depends, Request, status
from fastapi.responses import PlainTextResponse, Response
from pydantic import ValidationError
from sqlalchemy.orm import Session
from app.config import Settings
from app.dependencies import get_app_settings, get_homeassistant_client, get_poo_db
from app.integrations.homeassistant import HomeAssistantClient
from app.schemas.poo import PooRecordRequest
from app.services.poo import publish_latest_poo_status, record_poo
router = APIRouter(tags=["poo"])
logger = logging.getLogger(__name__)
BAD_REQUEST_MESSAGE = "bad request"
INTERNAL_SERVER_ERROR_MESSAGE = "internal server error"
@router.post("/poo/record")
async def create_poo_record(
request: Request,
db: Session = Depends(get_poo_db),
settings: Settings = Depends(get_app_settings),
homeassistant_client: HomeAssistantClient = Depends(get_homeassistant_client),
) -> Response:
try:
raw_payload = await request.body()
data = json.loads(raw_payload)
payload = PooRecordRequest.model_validate(data)
record_poo(
db,
payload,
settings=settings,
homeassistant_client=homeassistant_client,
)
except json.JSONDecodeError as exc:
logger.warning("Rejected poo record request due to invalid JSON: %s", exc)
return PlainTextResponse(BAD_REQUEST_MESSAGE, status_code=status.HTTP_400_BAD_REQUEST)
except ValidationError as exc:
logger.warning("Rejected poo record request due to validation failure: %s", exc)
return PlainTextResponse(BAD_REQUEST_MESSAGE, status_code=status.HTTP_400_BAD_REQUEST)
except ValueError as exc:
logger.warning("Rejected poo record request due to invalid numeric input: %s", exc)
return PlainTextResponse(BAD_REQUEST_MESSAGE, status_code=status.HTTP_400_BAD_REQUEST)
except Exception as exc:
logger.warning("Failed to store poo record: %s", exc)
return PlainTextResponse(
INTERNAL_SERVER_ERROR_MESSAGE,
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
return Response(status_code=status.HTTP_200_OK)
@router.get("/poo/latest")
def notify_latest_poo(
db: Session = Depends(get_poo_db),
settings: Settings = Depends(get_app_settings),
homeassistant_client: HomeAssistantClient = Depends(get_homeassistant_client),
) -> Response:
try:
publish_latest_poo_status(
session=db,
settings=settings,
homeassistant_client=homeassistant_client,
)
except Exception as exc:
logger.warning("Failed to publish latest poo status: %s", exc)
return PlainTextResponse(
INTERNAL_SERVER_ERROR_MESSAGE,
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
return Response(status_code=status.HTTP_200_OK)
@@ -0,0 +1,26 @@
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.orm import Session
from app.dependencies import get_auth_db, get_current_auth_session
from app.schemas.public_ip import PublicIPCheckResponse
from app.config import get_settings
from app.services.auth import AuthenticatedSession
from app.services.public_ip import check_public_ipv4_and_notify
router = APIRouter(tags=["public-ip"])
@router.get("/public-ip/check", response_model=PublicIPCheckResponse)
def run_public_ip_check(
session: Session = Depends(get_auth_db),
current_auth: AuthenticatedSession | None = Depends(get_current_auth_session),
) -> PublicIPCheckResponse:
if current_auth is None:
raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="authentication required")
result = check_public_ipv4_and_notify(session, bootstrap_settings=get_settings())
return PublicIPCheckResponse(
status=result.status,
checked_at=result.checked_at,
changed=result.changed,
)
@@ -0,0 +1,11 @@
from fastapi import APIRouter
from app.schemas.health import StatusResponse
router = APIRouter(tags=["system"])
@router.get("/status", response_model=StatusResponse)
def get_status() -> StatusResponse:
return StatusResponse(status="ok")
@@ -0,0 +1,79 @@
import logging
from fastapi import APIRouter, Depends, Request, status
from fastapi.responses import PlainTextResponse, RedirectResponse, Response
from sqlalchemy.orm import Session
from app.config import Settings
from app.dependencies import (
get_app_settings,
get_auth_db,
get_current_auth_session,
get_ticktick_client,
)
from app.integrations.ticktick import (
    TickTickAuthError,
    TickTickClient,
    TickTickConfigError,
    TickTickRequestError,
)
from app.services.auth import AuthenticatedSession
from app.services.config_page import save_config_value
router = APIRouter(tags=["ticktick"])
logger = logging.getLogger(__name__)
@router.get("/ticktick/auth/start")
def start_ticktick_auth(
current_auth: AuthenticatedSession | None = Depends(get_current_auth_session),
ticktick_client: TickTickClient = Depends(get_ticktick_client),
) -> Response:
if current_auth is None:
return RedirectResponse(url="/login", status_code=status.HTTP_303_SEE_OTHER)
try:
authorization_url = ticktick_client.build_authorization_url()
except TickTickConfigError as exc:
logger.warning("Rejected TickTick OAuth start due to incomplete configuration: %s", exc)
        return PlainTextResponse(
            "TickTick integration is not configured",
            status_code=status.HTTP_400_BAD_REQUEST,
        )
return RedirectResponse(url=authorization_url, status_code=status.HTTP_303_SEE_OTHER)
@router.get("/ticktick/auth/code")
def handle_ticktick_auth_code(
request: Request,
auth_db_session: Session = Depends(get_auth_db),
settings: Settings = Depends(get_app_settings),
ticktick_client: TickTickClient = Depends(get_ticktick_client),
) -> Response:
code = request.query_params.get("code", "")
state = request.query_params.get("state", "")
if not code or not state:
return RedirectResponse(
url="/config?ticktick_oauth=invalid-callback",
status_code=status.HTTP_303_SEE_OTHER,
)
try:
token = ticktick_client.exchange_authorization_code(code=code, state=state)
save_config_value(
auth_db_session,
env_name="TICKTICK_TOKEN",
value=token,
bootstrap_settings=settings,
)
except TickTickAuthError as exc:
logger.warning("Rejected TickTick OAuth callback due to invalid state: %s", exc)
return RedirectResponse(
url="/config?ticktick_oauth=invalid-state",
status_code=status.HTTP_303_SEE_OTHER,
)
except (TickTickConfigError, TickTickRequestError, ValueError) as exc:
logger.warning("TickTick OAuth callback failed: %s", exc)
return RedirectResponse(
url="/config?ticktick_oauth=failed",
status_code=status.HTTP_303_SEE_OTHER,
)
return RedirectResponse(
url="/config?ticktick_oauth=success",
status_code=status.HTTP_303_SEE_OTHER,
)
@@ -0,0 +1,53 @@
from collections.abc import Generator
from functools import lru_cache
from sqlalchemy import create_engine
from sqlalchemy.orm import DeclarativeBase, Session, sessionmaker
from app.config import get_settings
class AuthBase(DeclarativeBase):
pass
def _build_connect_args(database_url: str) -> dict[str, object]:
connect_args: dict[str, object] = {}
if database_url.startswith("sqlite"):
connect_args["check_same_thread"] = False
return connect_args
@lru_cache
def _get_auth_engine(database_url: str):
return create_engine(database_url, connect_args=_build_connect_args(database_url))
@lru_cache
def _get_auth_session_local(database_url: str):
engine = _get_auth_engine(database_url)
return sessionmaker(bind=engine, autoflush=False, autocommit=False, class_=Session)
def get_auth_engine():
settings = get_settings()
return _get_auth_engine(settings.app_database_url)
def get_auth_session_local():
settings = get_settings()
return _get_auth_session_local(settings.app_database_url)
def reset_auth_db_caches() -> None:
_get_auth_session_local.cache_clear()
_get_auth_engine.cache_clear()
def get_auth_db_session() -> Generator[Session, None, None]:
session_local = get_auth_session_local()
session = session_local()
try:
yield session
finally:
session.close()
@@ -0,0 +1,105 @@
from functools import lru_cache
from pathlib import Path
from pydantic import computed_field
from pydantic_settings import BaseSettings, SettingsConfigDict
class Settings(BaseSettings):
app_name: str = "Home Automation Backend (Python)"
app_env: str = "production"
app_debug: bool = False
app_hostname: str = "localhost:8000"
app_database_url: str = "sqlite:///./data/app.db"
location_database_url: str = "sqlite:///./data/locationRecorder.db"
poo_database_url: str = "sqlite:///./data/pooRecorder.db"
ticktick_client_id: str = ""
ticktick_client_secret: str = ""
ticktick_token: str = ""
home_assistant_base_url: str = ""
home_assistant_auth_token: str = ""
home_assistant_timeout_seconds: float = 1.0
home_assistant_action_task_project_id: str = ""
smtp_enabled: bool = False
smtp_host: str = ""
smtp_port: int = 587
smtp_username: str = ""
smtp_password: str = ""
smtp_from_name: str = ""
smtp_from_address: str = ""
smtp_to_address: str = ""
smtp_use_starttls: bool = True
poo_webhook_id: str = ""
poo_sensor_entity_name: str = "sensor.test_poo_status"
poo_sensor_friendly_name: str = "Poo Status"
auth_bootstrap_username: str = "admin"
auth_bootstrap_password: str = "admin"
auth_session_cookie_name: str = "home_automation_session"
auth_session_ttl_hours: int = 12
auth_cookie_secure_override: bool | None = True
model_config = SettingsConfigDict(
env_file=".env",
env_file_encoding="utf-8",
case_sensitive=False,
extra="ignore",
)
@computed_field
@property
def is_development(self) -> bool:
return self.app_env.lower() == "development"
@computed_field
@property
def app_base_url(self) -> str:
hostname = self.app_hostname.strip().rstrip("/")
if not hostname:
return ""
scheme = "http" if self.is_development else "https"
return f"{scheme}://{hostname}"
@computed_field
@property
def ticktick_redirect_uri(self) -> str:
if not self.app_base_url:
return ""
return f"{self.app_base_url}/ticktick/auth/code"
@staticmethod
def _sqlite_path_from_url(database_url: str) -> Path | None:
prefix = "sqlite:///"
if not database_url.startswith(prefix):
return None
raw_path = database_url[len(prefix) :]
return Path(raw_path)
@computed_field
@property
def location_sqlite_path(self) -> Path | None:
return self._sqlite_path_from_url(self.location_database_url)
@computed_field
@property
def app_sqlite_path(self) -> Path | None:
return self._sqlite_path_from_url(self.app_database_url)
@computed_field
@property
def poo_sqlite_path(self) -> Path | None:
return self._sqlite_path_from_url(self.poo_database_url)
@computed_field
@property
def auth_cookie_secure(self) -> bool:
if self.auth_cookie_secure_override is not None:
return self.auth_cookie_secure_override
return not self.is_development
@lru_cache
def get_settings() -> Settings:
return Settings()
@@ -0,0 +1,28 @@
from collections.abc import Generator
from sqlalchemy import create_engine
from sqlalchemy.orm import DeclarativeBase, Session, sessionmaker
from app.config import get_settings
class Base(DeclarativeBase):
pass
settings = get_settings()
connect_args: dict[str, object] = {}
if settings.location_database_url.startswith("sqlite"):
connect_args["check_same_thread"] = False
engine = create_engine(settings.location_database_url, connect_args=connect_args)
SessionLocal = sessionmaker(bind=engine, autoflush=False, autocommit=False, class_=Session)
def get_db_session() -> Generator[Session, None, None]:
session = SessionLocal()
try:
yield session
finally:
session.close()
@@ -0,0 +1,46 @@
from collections.abc import Generator
from fastapi import Depends, Request
from sqlalchemy.orm import Session
from app.auth_db import get_auth_db_session
from app.config import Settings, get_settings
from app.db import get_db_session
from app.integrations.homeassistant import HomeAssistantClient
from app.integrations.ticktick import TickTickClient
from app.poo_db import get_poo_db_session
from app.services.auth import AuthenticatedSession, get_authenticated_session
from app.services.config_page import build_runtime_settings
def get_auth_db() -> Generator[Session, None, None]:
yield from get_auth_db_session()
def get_app_settings(session: Session = Depends(get_auth_db)) -> Settings:
return build_runtime_settings(session, get_settings())
def get_db() -> Generator[Session, None, None]:
yield from get_db_session()
def get_poo_db() -> Generator[Session, None, None]:
yield from get_poo_db_session()
def get_homeassistant_client(settings: Settings = Depends(get_app_settings)) -> HomeAssistantClient:
return HomeAssistantClient(settings)
def get_ticktick_client(settings: Settings = Depends(get_app_settings)) -> TickTickClient:
return TickTickClient(settings)
def get_current_auth_session(
request: Request,
session: Session = Depends(get_auth_db),
settings: Settings = Depends(get_app_settings),
) -> AuthenticatedSession | None:
raw_token = request.cookies.get(settings.auth_session_cookie_name)
return get_authenticated_session(session, raw_token=raw_token)
@@ -0,0 +1,2 @@
"""External integration placeholders for future migration."""
@@ -0,0 +1,108 @@
from __future__ import annotations
import json
import logging
from dataclasses import dataclass, field
from typing import Any
from urllib import error, parse, request
from app.config import Settings
logger = logging.getLogger(__name__)
SUCCESS_STATUS_CODES = {200, 201}
class HomeAssistantConfigError(RuntimeError):
"""Raised when required Home Assistant outbound configuration is missing."""
class HomeAssistantRequestError(RuntimeError):
"""Raised when a Home Assistant outbound HTTP request fails."""
@dataclass(slots=True)
class HomeAssistantClient:
settings: Settings
timeout_seconds: float | None = field(default=None)
def __post_init__(self) -> None:
if self.timeout_seconds is None:
self.timeout_seconds = self.settings.home_assistant_timeout_seconds
def is_configured(self) -> bool:
return bool(self.settings.home_assistant_base_url and self.settings.home_assistant_auth_token)
def publish_sensor(
self,
*,
entity_id: str,
state: str,
attributes: dict[str, Any] | None = None,
) -> None:
self._require_config()
if not entity_id:
raise ValueError("entity_id must not be empty")
payload = {
"entity_id": entity_id,
"state": state,
"attributes": attributes or {},
}
self._post_json(f"/api/states/{entity_id}", payload, operation="publish_sensor")
def trigger_webhook(self, *, webhook_id: str, body: Any) -> None:
self._require_config()
if not webhook_id:
raise ValueError("webhook_id must not be empty")
self._post_json(f"/api/webhook/{webhook_id}", body, operation="trigger_webhook")
def _require_config(self) -> None:
if self.is_configured():
return
raise HomeAssistantConfigError(
"Home Assistant outbound integration is not configured. "
"Set HOME_ASSISTANT_BASE_URL and HOME_ASSISTANT_AUTH_TOKEN."
)
def _post_json(self, path: str, payload: Any, *, operation: str) -> None:
url = self._build_url(path)
body = json.dumps(payload).encode("utf-8")
req = request.Request(url, data=body, method="POST")
req.add_header("Content-Type", "application/json")
req.add_header("Authorization", f"Bearer {self.settings.home_assistant_auth_token}")
try:
with request.urlopen(req, timeout=self.timeout_seconds) as response:
status_code = response.getcode()
except error.HTTPError as exc:
logger.warning(
"Home Assistant outbound %s failed with HTTP %s for %s",
operation,
exc.code,
url,
)
raise HomeAssistantRequestError(
f"Home Assistant outbound {operation} failed with HTTP {exc.code}"
) from exc
except error.URLError as exc:
logger.warning("Home Assistant outbound %s failed for %s: %s", operation, url, exc)
raise HomeAssistantRequestError(
f"Home Assistant outbound {operation} failed to reach Home Assistant"
) from exc
if status_code not in SUCCESS_STATUS_CODES:
logger.warning(
"Home Assistant outbound %s returned unexpected status %s for %s",
operation,
status_code,
url,
)
raise HomeAssistantRequestError(
f"Home Assistant outbound {operation} returned unexpected status {status_code}"
)
def _build_url(self, path: str) -> str:
base_url = self.settings.home_assistant_base_url.rstrip("/")
quoted_path = parse.quote(path.lstrip("/"), safe="/")
return f"{base_url}/{quoted_path}"
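`_build_url` trims the trailing slash from the base URL and the leading slash from the path, then percent-encodes the path while preserving `/` separators. A self-contained sketch of that logic (the base URL below is a hypothetical example, not taken from the repo's config):

```python
from urllib import parse

def build_url(base_url: str, path: str) -> str:
    # Mirror of _build_url: trim slashes on both sides, then percent-encode
    # the path while keeping "/" separators intact.
    base = base_url.rstrip("/")
    quoted = parse.quote(path.lstrip("/"), safe="/")
    return f"{base}/{quoted}"

# Hypothetical Home Assistant base URL, for illustration only.
print(build_url("http://homeassistant.local:8123/", "/api/states/sensor.living room"))
# → http://homeassistant.local:8123/api/states/sensor.living%20room
```

Trimming both sides means callers cannot produce a double slash regardless of how the base URL or path was written.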
@@ -0,0 +1,301 @@
from __future__ import annotations
import json
import logging
import base64
import secrets
from dataclasses import asdict, dataclass, field, fields
from typing import Any
from urllib import error, parse, request
from app.config import Settings
logger = logging.getLogger(__name__)
TICKTICK_AUTH_URL = "https://ticktick.com/oauth/authorize"
TICKTICK_TOKEN_URL = "https://ticktick.com/oauth/token"
TICKTICK_OPEN_API_BASE_URL = "https://api.ticktick.com/open/v1"
TICKTICK_DATETIME_FORMAT = "%Y-%m-%dT%H:%M:%S%z"
AUTH_SCOPE = "tasks:read tasks:write"
class TickTickConfigError(RuntimeError):
"""Raised when TickTick is missing required runtime configuration."""
class TickTickAuthError(RuntimeError):
"""Raised when TickTick OAuth state validation fails."""
class TickTickRequestError(RuntimeError):
"""Raised when a TickTick API request fails."""
@dataclass(slots=True)
class TickTickProject:
id: str
name: str
color: str | None = None
sortOrder: int | None = None
closed: bool | None = None
groupId: str | None = None
viewMode: str | None = None
permission: str | None = None
kind: str | None = None
@dataclass(slots=True)
class TickTickTask:
projectId: str
title: str
id: str | None = None
isAllDay: bool | None = None
completedTime: str | None = None
content: str | None = None
desc: str | None = None
dueDate: str | None = None
items: list[Any] | None = None
priority: int | None = None
reminders: list[str] | None = None
repeatFlag: str | None = None
sortOrder: int | None = None
startDate: str | None = None
status: int | None = None
timeZone: str | None = None
@dataclass(slots=True)
class TickTickAuthStateStore:
pending_state: str | None = None
def issue_state(self) -> str:
self.pending_state = secrets.token_hex(6)
return self.pending_state
def matches_state(self, state: str) -> bool:
return bool(self.pending_state and state == self.pending_state)
def consume_state(self, state: str) -> bool:
if not self.pending_state or state != self.pending_state:
return False
self.pending_state = None
return True
def clear(self) -> None:
self.pending_state = None
default_auth_state_store = TickTickAuthStateStore()
def _coerce_dataclass_payload(model_type: type, payload: dict[str, Any]) -> Any:
allowed_field_names = {item.name for item in fields(model_type)}
filtered_payload = {
key: value for key, value in payload.items() if key in allowed_field_names
}
return model_type(**filtered_payload)
@dataclass(slots=True)
class TickTickClient:
settings: Settings
auth_state_store: TickTickAuthStateStore = field(default_factory=lambda: default_auth_state_store)
timeout_seconds: float = 10.0
def is_configured(self) -> bool:
return bool(self._client_id() and self._client_secret())
def has_token(self) -> bool:
return bool(self.settings.ticktick_token)
def build_authorization_url(self) -> str:
self._require_auth_config()
state = self.auth_state_store.issue_state()
params = parse.urlencode(
{
"client_id": self._client_id(),
"response_type": "code",
"redirect_uri": self._redirect_uri(),
"state": state,
"scope": AUTH_SCOPE,
}
)
return f"{TICKTICK_AUTH_URL}?{params}"
def exchange_authorization_code(self, *, code: str, state: str) -> str:
self._require_auth_config()
if not code:
raise ValueError("code must not be empty")
if not state:
raise ValueError("state must not be empty")
if not self.auth_state_store.matches_state(state):
raise TickTickAuthError("Invalid state")
body = parse.urlencode(
{
"code": code,
"grant_type": "authorization_code",
"scope": AUTH_SCOPE,
"redirect_uri": self._redirect_uri(),
}
).encode("utf-8")
req = request.Request(TICKTICK_TOKEN_URL, data=body, method="POST")
req.add_header("Content-Type", "application/x-www-form-urlencoded")
req.add_header("Authorization", self._basic_auth_header())
payload = self._send_json_request(req, operation="exchange_authorization_code")
self.auth_state_store.clear()
token = payload.get("access_token")
if not isinstance(token, str) or not token:
raise TickTickRequestError("TickTick token response did not include access_token")
return token
def get_projects(self) -> list[TickTickProject]:
self._require_token()
payload = self._authorized_json_request(
method="GET",
path="/project/",
operation="get_projects",
)
if not isinstance(payload, list):
raise TickTickRequestError("TickTick get_projects returned an unexpected payload")
return [_coerce_dataclass_payload(TickTickProject, project) for project in payload]
def get_tasks(self, project_id: str) -> list[TickTickTask]:
self._require_token()
if not project_id:
raise ValueError("project_id must not be empty")
payload = self._authorized_json_request(
method="GET",
path=f"/project/{parse.quote(project_id, safe='')}/data",
operation="get_tasks",
accepted_status_codes={200, 404},
)
if payload is None:
return []
if not isinstance(payload, dict):
raise TickTickRequestError("TickTick get_tasks returned an unexpected payload")
tasks = payload.get("tasks", [])
if not isinstance(tasks, list):
raise TickTickRequestError("TickTick get_tasks returned an invalid tasks payload")
return [_coerce_dataclass_payload(TickTickTask, task) for task in tasks]
def has_duplicate_task(self, *, project_id: str, task_title: str) -> bool:
if not task_title:
raise ValueError("task_title must not be empty")
return any(task.title == task_title for task in self.get_tasks(project_id))
def create_task(self, task: TickTickTask) -> None:
self._require_token()
if not task.projectId:
raise ValueError("task.projectId must not be empty")
if not task.title:
raise ValueError("task.title must not be empty")
if self.has_duplicate_task(project_id=task.projectId, task_title=task.title):
return
payload = {key: value for key, value in asdict(task).items() if value is not None}
self._authorized_json_request(
method="POST",
path="/task",
operation="create_task",
body=payload,
accepted_status_codes={200},
)
def _authorized_json_request(
self,
*,
method: str,
path: str,
operation: str,
body: Any | None = None,
accepted_status_codes: set[int] | None = None,
) -> Any:
url = f"{TICKTICK_OPEN_API_BASE_URL}{path}"
encoded_body = None if body is None else json.dumps(body).encode("utf-8")
req = request.Request(url, data=encoded_body, method=method)
req.add_header("Authorization", f"Bearer {self.settings.ticktick_token}")
if body is not None:
req.add_header("Content-Type", "application/json")
return self._send_json_request(
req,
operation=operation,
accepted_status_codes=accepted_status_codes,
)
def _send_json_request(
self,
req: request.Request,
*,
operation: str,
accepted_status_codes: set[int] | None = None,
) -> Any:
accepted_codes = accepted_status_codes or {200}
try:
with request.urlopen(req, timeout=self.timeout_seconds) as response:
status_code = response.getcode()
if status_code not in accepted_codes:
raise TickTickRequestError(
f"TickTick {operation} returned unexpected status {status_code}"
)
raw_body = response.read()
except error.HTTPError as exc:
if exc.code in accepted_codes:
raw_body = exc.read()
else:
logger.warning(
"TickTick %s failed with HTTP %s for %s",
operation,
exc.code,
req.full_url,
)
raise TickTickRequestError(
f"TickTick {operation} failed with HTTP {exc.code}"
) from exc
except error.URLError as exc:
logger.warning("TickTick %s failed for %s: %s", operation, req.full_url, exc)
raise TickTickRequestError(
f"TickTick {operation} failed to reach TickTick API"
) from exc
if not raw_body:
return None
try:
return json.loads(raw_body)
except json.JSONDecodeError as exc:
raise TickTickRequestError(
f"TickTick {operation} returned invalid JSON"
) from exc
def _basic_auth_header(self) -> str:
raw_credentials = f"{self._client_id()}:{self._client_secret()}"
token = base64.b64encode(raw_credentials.encode("utf-8")).decode("ascii")
return f"Basic {token}"
def _client_id(self) -> str:
return self.settings.ticktick_client_id.strip()
def _client_secret(self) -> str:
return self.settings.ticktick_client_secret.strip()
def _redirect_uri(self) -> str:
return self.settings.ticktick_redirect_uri
def _require_auth_config(self) -> None:
if not self.is_configured():
raise TickTickConfigError(
"TickTick integration is not configured. Set TICKTICK_CLIENT_ID and "
"TICKTICK_CLIENT_SECRET."
)
if not self._redirect_uri():
raise TickTickConfigError(
"TickTick integration is missing APP_HOSTNAME for OAuth callback generation."
)
def _require_token(self) -> None:
self._require_auth_config()
if self.has_token():
return
raise TickTickConfigError(
"TickTick integration is missing TICKTICK_TOKEN. Complete the OAuth flow first."
)
@@ -0,0 +1,129 @@
from contextlib import asynccontextmanager
from pathlib import Path
from fastapi import FastAPI
from fastapi.staticfiles import StaticFiles
from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.triggers.interval import IntervalTrigger
from sqlalchemy.orm import Session
from app import models # noqa: F401
from app.api.routes.auth import router as auth_router
from app.api.routes import pages, status
import app.auth_db as auth_db
from app.api.routes.homeassistant import router as homeassistant_router
from app.api.routes.location import router as location_router
from app.api.routes.poo import router as poo_router
from app.api.routes.public_ip import router as public_ip_router
from app.api.routes.ticktick import router as ticktick_router
from app.config import get_settings
from app.services.auth import AuthBootstrapError, initialize_auth_schema
from app.services.config_page import seed_missing_config_from_bootstrap, sync_app_hostname_from_bootstrap
from app.services.public_ip import check_public_ipv4_and_notify
from scripts.app_db_adopt import AppDatabaseAdoptionError, validate_app_runtime_db
from scripts.location_db_adopt import LocationDatabaseAdoptionError, validate_location_runtime_db
from scripts.poo_db_adopt import PooDatabaseAdoptionError, validate_poo_runtime_db
def _run_scheduled_public_ip_check() -> None:
session_local = auth_db.get_auth_session_local()
session: Session = session_local()
try:
check_public_ipv4_and_notify(session, bootstrap_settings=get_settings())
finally:
session.close()
def ensure_auth_db_ready() -> None:
session_local = auth_db.get_auth_session_local()
session: Session = session_local()
try:
validate_app_runtime_db(get_settings().app_database_url)
initialize_auth_schema(session, get_settings())
seed_missing_config_from_bootstrap(session, get_settings())
sync_app_hostname_from_bootstrap(session, get_settings())
except AppDatabaseAdoptionError as exc:
raise RuntimeError(str(exc)) from exc
except AuthBootstrapError as exc:
raise RuntimeError(str(exc)) from exc
finally:
session.close()
def ensure_location_db_ready() -> None:
settings = get_settings()
if settings.location_sqlite_path is None:
return
try:
validate_location_runtime_db(settings.location_database_url)
except LocationDatabaseAdoptionError as exc:
raise RuntimeError(str(exc)) from exc
def ensure_poo_db_ready() -> None:
settings = get_settings()
if settings.poo_sqlite_path is None:
return
try:
validate_poo_runtime_db(settings.poo_database_url)
except PooDatabaseAdoptionError as exc:
raise RuntimeError(str(exc)) from exc
def ensure_runtime_dirs() -> None:
settings = get_settings()
for path in (settings.app_sqlite_path, settings.location_sqlite_path, settings.poo_sqlite_path):
if path is not None:
path.parent.mkdir(parents=True, exist_ok=True)
@asynccontextmanager
async def lifespan(_: FastAPI):
ensure_runtime_dirs()
ensure_auth_db_ready()
ensure_location_db_ready()
ensure_poo_db_ready()
scheduler = BackgroundScheduler(timezone="UTC")
scheduler.add_job(
_run_scheduled_public_ip_check,
trigger=IntervalTrigger(hours=4),
id="public-ip-check",
replace_existing=True,
max_instances=1,
coalesce=True,
)
scheduler.start()
yield
scheduler.shutdown(wait=False)
def create_app() -> FastAPI:
settings = get_settings()
app = FastAPI(
title=settings.app_name,
debug=settings.app_debug,
version="0.1.0",
lifespan=lifespan,
description=(
"Home automation backend with auth, runtime config, Home Assistant "
"integrations, TickTick integration, and SQLite-backed recorders."
),
)
static_dir = Path(__file__).parent / "static"
app.mount("/static", StaticFiles(directory=static_dir), name="static")
app.include_router(status.router)
app.include_router(auth_router)
app.include_router(pages.router)
app.include_router(homeassistant_router)
app.include_router(location_router)
app.include_router(poo_router)
app.include_router(public_ip_router)
app.include_router(ticktick_router)
return app
app = create_app()
@@ -0,0 +1,15 @@
"""SQLAlchemy models package."""
from app.models.auth import AuthSession, AuthUser
from app.models.config import AppConfigEntry
from app.models.location import Location
from app.models.public_ip import PublicIPHistory, PublicIPState
__all__ = [
"AppConfigEntry",
"AuthSession",
"AuthUser",
"Location",
"PublicIPHistory",
"PublicIPState",
]
@@ -0,0 +1,33 @@
from datetime import datetime
from sqlalchemy import Boolean, DateTime, ForeignKey, Integer, String
from sqlalchemy.orm import Mapped, mapped_column, relationship
from app.auth_db import AuthBase
class AuthUser(AuthBase):
__tablename__ = "auth_users"
id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)
username: Mapped[str] = mapped_column(String(255), unique=True, nullable=False, index=True)
password_hash: Mapped[str] = mapped_column(String(255), nullable=False)
is_active: Mapped[bool] = mapped_column(Boolean, nullable=False, default=True)
force_password_change: Mapped[bool] = mapped_column(Boolean, nullable=False, default=True)
created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), nullable=False)
sessions: Mapped[list["AuthSession"]] = relationship(back_populates="user")
class AuthSession(AuthBase):
__tablename__ = "auth_sessions"
id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)
user_id: Mapped[int] = mapped_column(ForeignKey("auth_users.id"), nullable=False, index=True)
token_hash: Mapped[str] = mapped_column(String(64), unique=True, nullable=False, index=True)
csrf_token: Mapped[str] = mapped_column(String(128), nullable=False)
created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), nullable=False)
expires_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), nullable=False, index=True)
revoked_at: Mapped[datetime | None] = mapped_column(DateTime(timezone=True), nullable=True)
user: Mapped[AuthUser] = relationship(back_populates="sessions")
@@ -0,0 +1,4 @@
from app.db import Base
__all__ = ["Base"]
@@ -0,0 +1,15 @@
from datetime import datetime
from sqlalchemy import DateTime, Integer, String
from sqlalchemy.orm import Mapped, mapped_column
from app.auth_db import AuthBase
class AppConfigEntry(AuthBase):
__tablename__ = "app_config"
id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)
key: Mapped[str] = mapped_column(String(255), unique=True, nullable=False, index=True)
value: Mapped[str] = mapped_column(String, nullable=False)
updated_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), nullable=False)
@@ -0,0 +1,15 @@
from sqlalchemy import Float, String
from sqlalchemy.orm import Mapped, mapped_column
from app.db import Base
class Location(Base):
__tablename__ = "location"
person: Mapped[str] = mapped_column(String, primary_key=True)
datetime: Mapped[str] = mapped_column(String, primary_key=True)
latitude: Mapped[float] = mapped_column(Float, nullable=False)
longitude: Mapped[float] = mapped_column(Float, nullable=False)
altitude: Mapped[float | None] = mapped_column(Float, nullable=True)
@@ -0,0 +1,13 @@
from sqlalchemy import Float, String
from sqlalchemy.orm import Mapped, mapped_column
from app.poo_db import PooBase
class PooRecord(PooBase):
__tablename__ = "poo_records"
timestamp: Mapped[str] = mapped_column(String, primary_key=True)
status: Mapped[str] = mapped_column(String, nullable=False)
latitude: Mapped[float] = mapped_column(Float, nullable=False)
longitude: Mapped[float] = mapped_column(Float, nullable=False)
@@ -0,0 +1,30 @@
from datetime import datetime
from sqlalchemy import DateTime, Integer, String
from sqlalchemy.orm import Mapped, mapped_column
from app.auth_db import AuthBase
class PublicIPState(AuthBase):
__tablename__ = "public_ip_state"
id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)
current_ipv4: Mapped[str] = mapped_column(String(45), nullable=False)
previous_ipv4: Mapped[str | None] = mapped_column(String(45), nullable=True)
first_seen_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), nullable=False)
last_checked_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), nullable=False)
last_changed_at: Mapped[datetime | None] = mapped_column(DateTime(timezone=True), nullable=True)
last_check_status: Mapped[str] = mapped_column(String(32), nullable=False)
last_check_error: Mapped[str | None] = mapped_column(String(255), nullable=True)
last_provider: Mapped[str | None] = mapped_column(String(64), nullable=True)
class PublicIPHistory(AuthBase):
__tablename__ = "public_ip_history"
id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)
ipv4: Mapped[str] = mapped_column(String(45), nullable=False)
observed_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), nullable=False)
change_type: Mapped[str] = mapped_column(String(32), nullable=False)
provider: Mapped[str | None] = mapped_column(String(64), nullable=True)
@@ -0,0 +1,28 @@
from collections.abc import Generator
from sqlalchemy import create_engine
from sqlalchemy.orm import DeclarativeBase, Session, sessionmaker
from app.config import get_settings
class PooBase(DeclarativeBase):
pass
settings = get_settings()
connect_args: dict[str, object] = {}
if settings.poo_database_url.startswith("sqlite"):
connect_args["check_same_thread"] = False
poo_engine = create_engine(settings.poo_database_url, connect_args=connect_args)
PooSessionLocal = sessionmaker(bind=poo_engine, autoflush=False, autocommit=False, class_=Session)
def get_poo_db_session() -> Generator[Session, None, None]:
session = PooSessionLocal()
try:
yield session
finally:
session.close()
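Both recorder DB modules set `check_same_thread=False` for sqlite URLs. The flag exists because the underlying `sqlite3` driver (which SQLAlchemy's sqlite dialect wraps) refuses cross-thread use of a connection by default, while a web app may service requests on different threads. A stdlib-only demonstration of the relaxed behavior:

```python
import sqlite3
import threading

# Without check_same_thread=False, using this connection from the worker
# thread below would raise sqlite3.ProgrammingError.
conn = sqlite3.connect(":memory:", check_same_thread=False)
conn.execute("CREATE TABLE t (x INTEGER)")

def insert_from_worker() -> None:
    conn.execute("INSERT INTO t (x) VALUES (1)")

worker = threading.Thread(target=insert_from_worker)
worker.start()
worker.join()
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 1
```

Relaxing the check shifts responsibility for serializing access onto the caller; here SQLAlchemy's session/pool handling provides that discipline.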
@@ -0,0 +1,2 @@
"""Pydantic schemas package."""
@@ -0,0 +1,6 @@
from pydantic import BaseModel
class StatusResponse(BaseModel):
status: str
@@ -0,0 +1,9 @@
from pydantic import BaseModel, ConfigDict
class HomeAssistantPublishEnvelope(BaseModel):
target: str
action: str
content: str
model_config = ConfigDict(extra="forbid")
@@ -0,0 +1,10 @@
from pydantic import BaseModel, ConfigDict
class LocationRecordRequest(BaseModel):
person: str
latitude: str
longitude: str
altitude: str | None = None
model_config = ConfigDict(extra="forbid")
@@ -0,0 +1,9 @@
from pydantic import BaseModel, ConfigDict
class PooRecordRequest(BaseModel):
status: str
latitude: str
longitude: str
model_config = ConfigDict(extra="forbid")
@@ -0,0 +1,13 @@
from datetime import datetime
from typing import Literal
from pydantic import BaseModel
PublicIPCheckStatus = Literal["first_seen", "unchanged", "changed", "error"]
class PublicIPCheckResponse(BaseModel):
status: PublicIPCheckStatus
checked_at: datetime
changed: bool
@@ -0,0 +1,9 @@
from pydantic import BaseModel, ConfigDict, Field
class TickTickActionTaskRequest(BaseModel):
title: str | None = None
action: str
due_hour: int = Field(alias="due_hour")
model_config = ConfigDict(extra="forbid", populate_by_name=True)
@@ -0,0 +1,2 @@
"""Service layer package."""
@@ -0,0 +1,192 @@
from __future__ import annotations
import hashlib
import logging
import secrets
from dataclasses import dataclass
from datetime import UTC, datetime, timedelta
from argon2 import PasswordHasher
from argon2.exceptions import InvalidHashError, VerificationError, VerifyMismatchError
from sqlalchemy import Select, select
from sqlalchemy.orm import Session
from app.config import Settings
from app.models.auth import AuthSession, AuthUser
logger = logging.getLogger(__name__)
password_hasher = PasswordHasher()
class AuthBootstrapError(RuntimeError):
"""Raised when the auth system cannot be safely initialized."""
class AuthPasswordChangeError(ValueError):
"""Raised when a password change request is invalid."""
@dataclass(slots=True)
class AuthenticatedSession:
user: AuthUser
session: AuthSession
def initialize_auth_schema(session: Session, settings: Settings) -> None:
has_any_user = session.scalar(select(AuthUser.id).limit(1)) is not None
if has_any_user:
return
if not settings.auth_bootstrap_username or not settings.auth_bootstrap_password:
raise AuthBootstrapError(
"Auth DB has no users. Set AUTH_BOOTSTRAP_USERNAME and "
"AUTH_BOOTSTRAP_PASSWORD before starting the app."
)
bootstrap_user = AuthUser(
username=settings.auth_bootstrap_username,
password_hash=hash_password(settings.auth_bootstrap_password),
is_active=True,
force_password_change=True,
created_at=_utc_now(),
)
session.add(bootstrap_user)
session.commit()
logger.warning(
"Bootstrapped initial auth user '%s'. Rotate AUTH_BOOTSTRAP_PASSWORD after first setup.",
bootstrap_user.username,
)
def hash_password(password: str) -> str:
return password_hasher.hash(password)
def verify_password(password: str, stored_hash: str) -> bool:
try:
return password_hasher.verify(stored_hash, password)
except VerifyMismatchError:
return False
except (InvalidHashError, VerificationError):
return False
def authenticate_user(session: Session, *, username: str, password: str) -> AuthUser | None:
user = session.scalar(select(AuthUser).where(AuthUser.username == username).limit(1))
if user is None or not user.is_active:
logger.info("Failed login for unknown or inactive user '%s'", username)
return None
if not verify_password(password, user.password_hash):
logger.info("Failed login due to invalid password for user '%s'", username)
return None
return user
def create_session(session: Session, *, user: AuthUser, settings: Settings) -> tuple[AuthSession, str]:
raw_token = secrets.token_urlsafe(32)
auth_session = AuthSession(
user_id=user.id,
token_hash=_hash_token(raw_token),
csrf_token=secrets.token_urlsafe(24),
created_at=_utc_now(),
expires_at=_utc_now() + timedelta(hours=settings.auth_session_ttl_hours),
revoked_at=None,
)
session.add(auth_session)
session.commit()
session.refresh(auth_session)
return auth_session, raw_token
def get_authenticated_session(session: Session, *, raw_token: str | None) -> AuthenticatedSession | None:
if not raw_token:
return None
stmt: Select[tuple[AuthSession, AuthUser]] = (
select(AuthSession, AuthUser)
.join(AuthUser, AuthSession.user_id == AuthUser.id)
.where(AuthSession.token_hash == _hash_token(raw_token))
.limit(1)
)
result = session.execute(stmt).first()
if result is None:
return None
auth_session, user = result
now = _utc_now()
expires_at = _as_utc(auth_session.expires_at)
revoked_at = _as_utc(auth_session.revoked_at)
if expires_at is None:
logger.warning("Auth session %s has no expires_at; treating it as invalid", auth_session.id)
return None
if revoked_at is not None or expires_at <= now or not user.is_active:
if revoked_at is None and expires_at <= now:
auth_session.revoked_at = now
session.commit()
return None
return AuthenticatedSession(user=user, session=auth_session)
def revoke_session(session: Session, *, auth_session: AuthSession) -> None:
if auth_session.revoked_at is not None:
return
auth_session.revoked_at = _utc_now()
session.commit()
def change_password(
session: Session,
*,
user: AuthUser,
current_password: str,
new_password: str,
confirm_password: str,
) -> None:
if not verify_password(current_password, user.password_hash):
raise AuthPasswordChangeError("current password is invalid")
if not new_password:
raise AuthPasswordChangeError("new password must not be empty")
if new_password != confirm_password:
raise AuthPasswordChangeError("new password confirmation does not match")
if len(new_password) < 8:
raise AuthPasswordChangeError("new password must be at least 8 characters long")
if verify_password(new_password, user.password_hash):
raise AuthPasswordChangeError("new password must be different from the current password")
user.password_hash = hash_password(new_password)
user.force_password_change = False
session.commit()
def issue_login_csrf_token() -> str:
return secrets.token_urlsafe(24)
def validate_csrf_token(*, expected: str | None, actual: str | None) -> bool:
if not expected or not actual:
return False
return secrets.compare_digest(expected, actual)
def _hash_token(raw_token: str) -> str:
return hashlib.sha256(raw_token.encode("utf-8")).hexdigest()
def _utc_now() -> datetime:
return datetime.now(UTC)
def _as_utc(value: datetime | None) -> datetime | None:
if value is None:
return None
if value.tzinfo is None:
return value.replace(tzinfo=UTC)
return value.astimezone(UTC)
@@ -0,0 +1,289 @@
from __future__ import annotations
from dataclasses import dataclass
from datetime import UTC, datetime
from typing import Any
from sqlalchemy import select
from sqlalchemy.orm import Session
from app.auth_db import reset_auth_db_caches
from app.config import Settings, get_settings
from app.models.config import AppConfigEntry
@dataclass(frozen=True, slots=True)
class ConfigField:
section: str
env_name: str
setting_attr: str
label: str
secret: bool = False
input_type: str = "text"
CONFIG_FIELDS: tuple[ConfigField, ...] = (
ConfigField("System", "APP_NAME", "app_name", "App Name"),
ConfigField("System", "APP_ENV", "app_env", "App Env"),
ConfigField("System", "APP_DEBUG", "app_debug", "App Debug"),
ConfigField("System", "APP_HOSTNAME", "app_hostname", "App Hostname"),
ConfigField("SMTP", "SMTP_ENABLED", "smtp_enabled", "SMTP Enabled"),
ConfigField("SMTP", "SMTP_HOST", "smtp_host", "SMTP Host"),
ConfigField("SMTP", "SMTP_PORT", "smtp_port", "SMTP Port"),
ConfigField("SMTP", "SMTP_USERNAME", "smtp_username", "SMTP Username"),
ConfigField("SMTP", "SMTP_PASSWORD", "smtp_password", "SMTP Password", secret=True),
ConfigField("SMTP", "SMTP_FROM_NAME", "smtp_from_name", "SMTP From Name"),
ConfigField("SMTP", "SMTP_FROM_ADDRESS", "smtp_from_address", "SMTP From Address"),
ConfigField("SMTP", "SMTP_TO_ADDRESS", "smtp_to_address", "SMTP To Address"),
ConfigField("SMTP", "SMTP_USE_STARTTLS", "smtp_use_starttls", "SMTP Use STARTTLS"),
ConfigField(
"Authentication",
"AUTH_SESSION_COOKIE_NAME",
"auth_session_cookie_name",
"Session Cookie Name",
),
ConfigField("Authentication", "AUTH_SESSION_TTL_HOURS", "auth_session_ttl_hours", "Session TTL Hours"),
ConfigField(
"Authentication",
"AUTH_COOKIE_SECURE_OVERRIDE",
"auth_cookie_secure_override",
"Cookie Secure Override",
),
ConfigField("Poo", "POO_WEBHOOK_ID", "poo_webhook_id", "Poo Webhook ID", secret=True),
ConfigField(
"Poo",
"POO_SENSOR_ENTITY_NAME",
"poo_sensor_entity_name",
"Poo Sensor Entity Name",
),
ConfigField(
"Poo",
"POO_SENSOR_FRIENDLY_NAME",
"poo_sensor_friendly_name",
"Poo Sensor Friendly Name",
),
ConfigField("TickTick", "TICKTICK_CLIENT_ID", "ticktick_client_id", "TickTick Client ID"),
ConfigField(
"TickTick",
"TICKTICK_CLIENT_SECRET",
"ticktick_client_secret",
"TickTick Client Secret",
secret=True,
),
ConfigField("TickTick", "TICKTICK_TOKEN", "ticktick_token", "TickTick Token", secret=True),
ConfigField(
"Home Assistant",
"HOME_ASSISTANT_BASE_URL",
"home_assistant_base_url",
"Home Assistant Base URL",
),
ConfigField(
"Home Assistant",
"HOME_ASSISTANT_AUTH_TOKEN",
"home_assistant_auth_token",
"Home Assistant Auth Token",
secret=True,
),
ConfigField(
"Home Assistant",
"HOME_ASSISTANT_TIMEOUT_SECONDS",
"home_assistant_timeout_seconds",
"Home Assistant Timeout Seconds",
),
ConfigField(
"Home Assistant",
"HOME_ASSISTANT_ACTION_TASK_PROJECT_ID",
"home_assistant_action_task_project_id",
"Home Assistant Action Task Project ID",
),
)
class ConfigSaveError(ValueError):
"""Raised when the submitted config payload is invalid."""
def seed_missing_config_from_bootstrap(session: Session, bootstrap_settings: Settings) -> None:
current_values = _read_config_values(session)
missing_values: dict[str, str] = {}
for field in CONFIG_FIELDS:
if field.env_name in current_values:
continue
missing_values[field.env_name] = _stringify(getattr(bootstrap_settings, field.setting_attr))
if not missing_values:
return
_persist_config_values(session, {**current_values, **missing_values})
def sync_app_hostname_from_bootstrap(session: Session, bootstrap_settings: Settings) -> None:
current_values = _read_config_values(session)
bootstrap_hostname = _stringify(bootstrap_settings.app_hostname)
if current_values.get("APP_HOSTNAME") == bootstrap_hostname:
return
current_values["APP_HOSTNAME"] = bootstrap_hostname
_persist_config_values(session, current_values)
get_settings.cache_clear()
reset_auth_db_caches()
def build_runtime_settings(session: Session, bootstrap_settings: Settings) -> Settings:
overrides = _read_config_values(session)
if not overrides:
return bootstrap_settings
payload = _settings_payload(bootstrap_settings)
for field in CONFIG_FIELDS:
if field.env_name in overrides:
payload[field.setting_attr] = overrides[field.env_name]
return Settings(_env_file=None, **payload)
def build_config_sections(session: Session, bootstrap_settings: Settings) -> list[dict[str, Any]]:
runtime_settings = build_runtime_settings(session, bootstrap_settings)
persisted_values = _read_config_values(session)
sections: list[dict[str, Any]] = []
current_section: dict[str, Any] | None = None
for field in CONFIG_FIELDS:
if current_section is None or current_section["name"] != field.section:
current_section = {"name": field.section, "fields": []}
sections.append(current_section)
current_section["fields"].append(
{
"env_name": field.env_name,
"label": field.label,
"value": "" if field.secret else _stringify(getattr(runtime_settings, field.setting_attr)),
"secret": field.secret,
"input_type": "password" if field.secret else field.input_type,
"configured": field.env_name in persisted_values
or bool(_stringify(getattr(bootstrap_settings, field.setting_attr))),
}
)
return sections
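`build_config_sections` groups consecutive `CONFIG_FIELDS` entries with the same `section` value, so fields belonging to one section must be listed contiguously. The same pattern in a standalone sketch, with a simplified `Field` tuple standing in for the app's `ConfigField`:

```python
from collections import namedtuple

# Simplified stand-in for the app's ConfigField (illustrative only).
Field = namedtuple("Field", ["section", "env_name"])

FIELDS = [
    Field("Auth", "AUTH_COOKIE"),
    Field("Auth", "AUTH_TTL"),
    Field("SMTP", "SMTP_HOST"),
]

def group_by_section(fields):
    sections = []
    current = None
    for field in fields:
        # Start a new section whenever the section name changes;
        # this assumes fields of one section are contiguous.
        if current is None or current["name"] != field.section:
            current = {"name": field.section, "fields": []}
            sections.append(current)
        current["fields"].append(field.env_name)
    return sections

print(group_by_section(FIELDS))
# → [{'name': 'Auth', 'fields': ['AUTH_COOKIE', 'AUTH_TTL']}, {'name': 'SMTP', 'fields': ['SMTP_HOST']}]
```

If the same section name appeared twice non-contiguously, this loop would emit two separate section blocks.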
def save_config_updates(session: Session, form_data: dict[str, str], bootstrap_settings: Settings) -> None:
current_values = _read_config_values(session)
merged_values = dict(current_values)
for field in CONFIG_FIELDS:
submitted_value = form_data.get(field.env_name, "")
if field.secret:
# Blank secret fields mean "keep the stored value"; the form never
# echoes secrets back, so only non-empty submissions overwrite them.
if submitted_value:
merged_values[field.env_name] = submitted_value
else:
merged_values[field.env_name] = submitted_value
_validate_config_values(merged_values, bootstrap_settings)
_persist_config_values(session, merged_values)
get_settings.cache_clear()
reset_auth_db_caches()
def save_config_value(
session: Session,
*,
env_name: str,
value: str,
bootstrap_settings: Settings,
) -> None:
current_values = _read_config_values(session)
current_values[env_name] = value
_validate_config_values(current_values, bootstrap_settings)
_persist_config_values(session, current_values)
get_settings.cache_clear()
reset_auth_db_caches()
def is_ticktick_oauth_ready(settings: Settings) -> bool:
return bool(
settings.app_hostname
and settings.ticktick_client_id
and settings.ticktick_client_secret
)
def _read_config_values(session: Session) -> dict[str, str]:
rows = session.execute(select(AppConfigEntry).order_by(AppConfigEntry.key)).scalars().all()
return {row.key: row.value for row in rows}
def _validate_config_values(config_values: dict[str, str], bootstrap_settings: Settings) -> None:
payload = _settings_payload(bootstrap_settings)
for field in CONFIG_FIELDS:
if field.env_name in config_values:
payload[field.setting_attr] = config_values[field.env_name]
try:
Settings(_env_file=None, **payload)
except Exception as exc:
raise ConfigSaveError("invalid config submission") from exc
def _persist_config_values(session: Session, config_values: dict[str, str]) -> None:
existing_entries = {
row.key: row
for row in session.execute(select(AppConfigEntry)).scalars().all()
}
now = datetime.now(UTC)
for env_name, value in config_values.items():
entry = existing_entries.get(env_name)
if entry is None:
session.add(AppConfigEntry(key=env_name, value=value, updated_at=now))
else:
entry.value = value
entry.updated_at = now
session.commit()
def _stringify(value: Any) -> str:
if value is None:
return ""
if isinstance(value, bool):
return str(value).lower()
return str(value)
def _settings_payload(settings: Settings) -> dict[str, Any]:
return {
"app_name": settings.app_name,
"app_env": settings.app_env,
"app_debug": settings.app_debug,
"app_hostname": settings.app_hostname,
"app_database_url": settings.app_database_url,
"location_database_url": settings.location_database_url,
"poo_database_url": settings.poo_database_url,
"ticktick_client_id": settings.ticktick_client_id,
"ticktick_client_secret": settings.ticktick_client_secret,
"ticktick_token": settings.ticktick_token,
"home_assistant_base_url": settings.home_assistant_base_url,
"home_assistant_auth_token": settings.home_assistant_auth_token,
"home_assistant_timeout_seconds": settings.home_assistant_timeout_seconds,
"home_assistant_action_task_project_id": settings.home_assistant_action_task_project_id,
"smtp_enabled": settings.smtp_enabled,
"smtp_host": settings.smtp_host,
"smtp_port": settings.smtp_port,
"smtp_username": settings.smtp_username,
"smtp_password": settings.smtp_password,
"smtp_from_name": settings.smtp_from_name,
"smtp_from_address": settings.smtp_from_address,
"smtp_to_address": settings.smtp_to_address,
"smtp_use_starttls": settings.smtp_use_starttls,
"poo_webhook_id": settings.poo_webhook_id,
"poo_sensor_entity_name": settings.poo_sensor_entity_name,
"poo_sensor_friendly_name": settings.poo_sensor_friendly_name,
"auth_bootstrap_username": settings.auth_bootstrap_username,
"auth_bootstrap_password": settings.auth_bootstrap_password,
"auth_session_cookie_name": settings.auth_session_cookie_name,
"auth_session_ttl_hours": settings.auth_session_ttl_hours,
"auth_cookie_secure_override": settings.auth_cookie_secure_override,
}
@@ -0,0 +1,149 @@
from __future__ import annotations
from dataclasses import dataclass
from datetime import UTC, datetime
from email.message import EmailMessage
from email.utils import formataddr
import smtplib
from app.config import Settings
class EmailConfigurationError(ValueError):
"""Raised when SMTP settings are incomplete or disabled."""
class EmailDeliveryError(RuntimeError):
"""Raised when sending email fails."""
@dataclass(frozen=True, slots=True)
class SMTPConfig:
host: str
port: int
username: str
password: str
from_name: str
from_address: str
to_address: str
use_starttls: bool
def get_smtp_config(settings: Settings, *, require_enabled: bool = True) -> SMTPConfig:
if require_enabled and not settings.smtp_enabled:
raise EmailConfigurationError("SMTP is disabled")
if not settings.smtp_host:
raise EmailConfigurationError("SMTP host is required")
if settings.smtp_port <= 0:
raise EmailConfigurationError("SMTP port must be greater than zero")
if not settings.smtp_from_address:
raise EmailConfigurationError("SMTP from address is required")
if not settings.smtp_to_address:
raise EmailConfigurationError("SMTP to address is required")
return SMTPConfig(
host=settings.smtp_host,
port=settings.smtp_port,
username=settings.smtp_username,
password=settings.smtp_password,
from_name=settings.smtp_from_name,
from_address=settings.smtp_from_address,
to_address=settings.smtp_to_address,
use_starttls=settings.smtp_use_starttls,
)
def is_smtp_ready(settings: Settings) -> bool:
try:
get_smtp_config(settings, require_enabled=False)
except EmailConfigurationError:
return False
return True
def send_plaintext_email(
settings: Settings,
*,
subject: str,
body: str,
recipient: str | None = None,
require_enabled: bool = True,
) -> None:
smtp_config = get_smtp_config(settings, require_enabled=require_enabled)
message = EmailMessage()
message["Subject"] = subject
message["From"] = _build_from_header(smtp_config)
message["To"] = recipient or smtp_config.to_address
message.set_content(body)
try:
with smtplib.SMTP(smtp_config.host, smtp_config.port, timeout=10) as smtp:
smtp.ehlo()
if smtp_config.use_starttls:
smtp.starttls()
smtp.ehlo()
if smtp_config.username:
smtp.login(smtp_config.username, smtp_config.password)
smtp.send_message(
message,
from_addr=smtp_config.from_address,
to_addrs=[recipient or smtp_config.to_address],
)
except (OSError, smtplib.SMTPException) as exc:
error_message = _sanitize_error_message(str(exc), smtp_config.password)
raise EmailDeliveryError(error_message or "SMTP delivery failed") from exc
def send_smtp_test_email(settings: Settings) -> None:
send_plaintext_email(
settings,
subject="Home Automation SMTP Test",
body="This is a test email from Home Automation SMTP settings.",
require_enabled=False,
)
def send_public_ip_changed_email(
settings: Settings,
*,
previous_ipv4: str,
current_ipv4: str,
detected_at: datetime,
) -> None:
send_plaintext_email(
settings,
subject="Public IP changed",
body=(
"Your public IPv4 address has changed.\n\n"
f"Previous IP: {previous_ipv4}\n"
f"Current IP: {current_ipv4}\n"
f"Detected at: {_format_utc_timestamp(detected_at)}\n\n"
"If you use Namecheap API trusted IP restrictions, you may need to "
"update the trusted IP manually.\n"
),
)
def _sanitize_error_message(message: str, password: str) -> str:
sanitized = message
if password:
sanitized = sanitized.replace(password, "[redacted]")
return sanitized
def _format_utc_timestamp(value: datetime) -> str:
if value.tzinfo is None:
normalized = value.replace(tzinfo=UTC)
else:
normalized = value.astimezone(UTC)
return normalized.strftime("%Y-%m-%d %H:%M:%S UTC")
def _build_from_header(smtp_config: SMTPConfig) -> str:
if smtp_config.from_name:
return formataddr((smtp_config.from_name, smtp_config.from_address))
return smtp_config.from_address
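`formataddr` handles quoting and encoding of the display name, so `_build_from_header` never has to assemble the RFC 5322 `From` header by hand:

```python
from email.utils import formataddr

# formataddr renders a (display_name, address) pair as a From header,
# quoting the display name when it contains specials.
print(formataddr(("Home Automation", "bot@example.com")))
# → Home Automation <bot@example.com>
```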
@@ -0,0 +1,116 @@
from __future__ import annotations
import json
from datetime import UTC, datetime, time, timedelta
from sqlalchemy.orm import Session
from app.config import Settings
from app.integrations.homeassistant import HomeAssistantClient
from app.integrations.ticktick import TICKTICK_DATETIME_FORMAT, TickTickClient, TickTickTask
from app.schemas.homeassistant import HomeAssistantPublishEnvelope
from app.schemas.location import LocationRecordRequest
from app.schemas.ticktick import TickTickActionTaskRequest
from app.services.location import record_location
from app.services.poo import publish_latest_poo_status
class UnsupportedHomeAssistantMessage(RuntimeError):
"""Raised when the inbound gateway receives a target/action that is not supported yet."""
def handle_homeassistant_message(
session: Session,
envelope: HomeAssistantPublishEnvelope,
ticktick_client: TickTickClient | None = None,
poo_session: Session | None = None,
settings: Settings | None = None,
homeassistant_client: HomeAssistantClient | None = None,
) -> None:
if envelope.target == "location_recorder":
_handle_location_message(session, envelope)
return
if envelope.target == "poo_recorder":
_handle_poo_message(
envelope,
poo_session=poo_session,
settings=settings,
homeassistant_client=homeassistant_client,
)
return
if envelope.target == "ticktick":
_handle_ticktick_message(envelope, ticktick_client)
return
raise UnsupportedHomeAssistantMessage(
f"Unsupported Home Assistant target/action: {envelope.target}/{envelope.action}"
)
def _handle_location_message(session: Session, envelope: HomeAssistantPublishEnvelope) -> None:
if envelope.action != "record":
raise UnsupportedHomeAssistantMessage(
f"Unsupported Home Assistant target/action: {envelope.target}/{envelope.action}"
)
# Home Assistant renders the payload as a Python-style dict literal,
# so swap single quotes for double quotes before JSON parsing.
content = json.loads(envelope.content.replace("'", '"'))
payload = LocationRecordRequest.model_validate(content)
record_location(session, payload)
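The `content.replace("'", '"')` normalization works for simple payloads but breaks as soon as a value contains an apostrophe; since Home Assistant renders the payload as a Python dict literal, `ast.literal_eval` is a sturdier alternative (a sketch of the alternative, not what the handler currently does):

```python
import ast
import json

# A Python-style dict literal as Home Assistant might render it; the
# apostrophe inside the value defeats the naive quote swap.
content = "{'person': 'alice', 'note': \"she's home\"}"

try:
    json.loads(content.replace("'", '"'))
except json.JSONDecodeError:
    print("quote swap produced invalid JSON")

# literal_eval parses the Python literal directly, without executing code.
print(ast.literal_eval(content)["note"])
# → she's home
```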
def _handle_poo_message(
envelope: HomeAssistantPublishEnvelope,
*,
poo_session: Session | None,
settings: Settings | None,
homeassistant_client: HomeAssistantClient | None,
) -> None:
if envelope.action != "get_latest":
raise UnsupportedHomeAssistantMessage(
f"Unsupported Home Assistant target/action: {envelope.target}/{envelope.action}"
)
if poo_session is None or settings is None or homeassistant_client is None:
raise RuntimeError("Poo recorder integration is unavailable")
publish_latest_poo_status(
session=poo_session,
settings=settings,
homeassistant_client=homeassistant_client,
)
def _handle_ticktick_message(
envelope: HomeAssistantPublishEnvelope,
ticktick_client: TickTickClient | None,
) -> None:
if envelope.action != "create_action_task":
raise UnsupportedHomeAssistantMessage(
f"Unsupported Home Assistant target/action: {envelope.target}/{envelope.action}"
)
if ticktick_client is None:
raise UnsupportedHomeAssistantMessage("TickTick client is unavailable")
content = json.loads(envelope.content.replace("'", '"'))
payload = TickTickActionTaskRequest.model_validate(content)
project_id = ticktick_client.settings.home_assistant_action_task_project_id
if not project_id:
raise RuntimeError(
"TickTick action task integration is missing HOME_ASSISTANT_ACTION_TASK_PROJECT_ID"
)
ticktick_client.create_task(
TickTickTask(
projectId=project_id,
title=payload.action,
dueDate=build_action_task_due_date(datetime.now().astimezone(), payload.due_hour),
)
)
def build_action_task_due_date(now: datetime, due_hour: int) -> str:
local_now = now.astimezone()
due = local_now + timedelta(hours=due_hour)
next_midnight = datetime.combine(due.date(), time.min, tzinfo=local_now.tzinfo) + timedelta(days=1)
return next_midnight.astimezone(UTC).strftime(TICKTICK_DATETIME_FORMAT)
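The due-date logic above can be checked with a fixed instant. This mirror of `build_action_task_due_date` skips the TickTick string formatting (the exact `TICKTICK_DATETIME_FORMAT` is defined elsewhere) and returns the computed UTC datetime instead:

```python
from datetime import datetime, time, timedelta, timezone

def next_midnight_utc(now: datetime, due_hour: int) -> datetime:
    # Shift `now` by due_hour hours, then take the midnight that
    # follows that local date, expressed in UTC.
    due = now + timedelta(hours=due_hour)
    midnight = datetime.combine(due.date(), time.min, tzinfo=now.tzinfo) + timedelta(days=1)
    return midnight.astimezone(timezone.utc)

# 22:00 on Jan 1 in UTC+2, plus 4 hours, lands at 02:00 on Jan 2, so
# the due date is midnight of Jan 3 local time, i.e. 22:00 Jan 2 UTC.
now = datetime(2026, 1, 1, 22, 0, tzinfo=timezone(timedelta(hours=2)))
print(next_midnight_utc(now, 4))
# → 2026-01-02 22:00:00+00:00
```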
@@ -0,0 +1,42 @@
from datetime import datetime, timezone
from sqlalchemy import insert
from sqlalchemy.orm import Session
from app.models.location import Location
from app.schemas.location import LocationRecordRequest
def _parse_optional_float_compat(value: str | None) -> float:
"""Coerce an optional numeric string to float, defaulting to 0.0."""
try:
return float(value)
except (TypeError, ValueError):
return 0.0
def _parse_required_float(value: str, field_name: str) -> float:
try:
return float(value)
except (TypeError, ValueError) as exc:
raise ValueError(f"Invalid numeric value for {field_name}") from exc
def _utc_now_rfc3339() -> str:
now = datetime.now(timezone.utc).replace(microsecond=0)
return now.isoformat().replace("+00:00", "Z")
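`_utc_now_rfc3339` leans on `datetime.isoformat()` rendering a UTC offset as `+00:00`; applied to a fixed instant, the swap yields the compact RFC 3339 `Z` form:

```python
from datetime import datetime, timezone

ts = datetime(2026, 4, 29, 11, 45, 49, tzinfo=timezone.utc)
# isoformat() gives "2026-04-29T11:45:49+00:00"; the replace call
# rewrites the offset as the RFC 3339 "Z" suffix.
print(ts.isoformat().replace("+00:00", "Z"))
# → 2026-04-29T11:45:49Z
```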
def record_location(session: Session, payload: LocationRecordRequest) -> None:
stmt = (
insert(Location)
.prefix_with("OR IGNORE")
.values(
person=payload.person,
datetime=_utc_now_rfc3339(),
latitude=_parse_required_float(payload.latitude, "latitude"),
longitude=_parse_required_float(payload.longitude, "longitude"),
altitude=_parse_optional_float_compat(payload.altitude),
)
)
session.execute(stmt)
session.commit()
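`prefix_with("OR IGNORE")` emits SQLite's `INSERT OR IGNORE`, which silently drops rows that would violate a unique constraint. A minimal sketch with the stdlib `sqlite3` module, assuming the `Location` model (not shown here) defines a unique constraint along the lines of `(person, datetime)`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE location (person TEXT, datetime TEXT, "
    "UNIQUE(person, datetime))"
)
for _ in range(2):  # second insert is a duplicate and is ignored
    conn.execute(
        "INSERT OR IGNORE INTO location VALUES ('alice', '2026-04-29T11:45:49Z')"
    )
count = conn.execute("SELECT COUNT(*) FROM location").fetchone()[0]
print(count)
# → 1
```

Without the prefix, the second insert would raise an `IntegrityError` instead of being skipped.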
@@ -0,0 +1,112 @@
from __future__ import annotations
from dataclasses import dataclass
from datetime import datetime, timezone
import logging
from sqlalchemy import desc, insert, select
from sqlalchemy.orm import Session
from app.config import Settings
from app.integrations.homeassistant import (
HomeAssistantClient,
HomeAssistantConfigError,
HomeAssistantRequestError,
)
from app.models.poo import PooRecord
from app.schemas.poo import PooRecordRequest
logger = logging.getLogger(__name__)
@dataclass(slots=True)
class LatestPooRecord:
timestamp: str
status: str
latitude: float
longitude: float
def _parse_required_float(value: str, field_name: str) -> float:
try:
return float(value)
except (TypeError, ValueError) as exc:
raise ValueError(f"Invalid numeric value for {field_name}") from exc
def _utc_now_minute_precision() -> str:
now = datetime.now(timezone.utc).replace(second=0, microsecond=0)
return now.strftime("%Y-%m-%dT%H:%MZ")
def record_poo(
session: Session,
payload: PooRecordRequest,
*,
settings: Settings,
homeassistant_client: HomeAssistantClient,
) -> None:
stmt = insert(PooRecord).prefix_with("OR IGNORE").values(
timestamp=_utc_now_minute_precision(),
status=payload.status,
latitude=_parse_required_float(payload.latitude, "latitude"),
longitude=_parse_required_float(payload.longitude, "longitude"),
)
session.execute(stmt)
session.commit()
try:
publish_latest_poo_status(
session=session,
settings=settings,
homeassistant_client=homeassistant_client,
)
except (HomeAssistantConfigError, HomeAssistantRequestError) as exc:
logger.warning("Failed to publish latest poo status to Home Assistant: %s", exc)
if settings.poo_webhook_id:
try:
homeassistant_client.trigger_webhook(
webhook_id=settings.poo_webhook_id,
body={"status": payload.status},
)
except (HomeAssistantConfigError, HomeAssistantRequestError) as exc:
logger.warning("Failed to trigger poo webhook on Home Assistant: %s", exc)
def get_latest_poo_record(session: Session) -> LatestPooRecord | None:
stmt = select(PooRecord).order_by(desc(PooRecord.timestamp)).limit(1)
record = session.execute(stmt).scalar_one_or_none()
if record is None:
logger.info("No poo record is available yet")
return None
return LatestPooRecord(
timestamp=record.timestamp,
status=record.status,
latitude=record.latitude,
longitude=record.longitude,
)
def publish_latest_poo_status(
*,
session: Session,
settings: Settings,
homeassistant_client: HomeAssistantClient,
) -> LatestPooRecord | None:
latest = get_latest_poo_record(session)
if latest is None:
logger.info("Skipping Home Assistant poo sensor publish because no poo record exists yet")
return None
record_time = datetime.fromisoformat(latest.timestamp.replace("Z", "+00:00")).astimezone()
homeassistant_client.publish_sensor(
entity_id=settings.poo_sensor_entity_name,
state=latest.status,
attributes={
"last_poo": record_time.strftime("%a | %Y-%m-%d | %H:%M"),
"friendly_name": settings.poo_sensor_friendly_name,
},
)
return latest
@@ -0,0 +1,191 @@
from __future__ import annotations
import ipaddress
import logging
from dataclasses import dataclass
from datetime import UTC, datetime
from typing import Callable, Literal
import httpx
from sqlalchemy import select
from sqlalchemy.orm import Session
from app.config import Settings
from app.models.public_ip import PublicIPHistory, PublicIPState
from app.services.config_page import build_runtime_settings
from app.services.email import EmailConfigurationError, EmailDeliveryError, send_public_ip_changed_email
logger = logging.getLogger(__name__)
PUBLIC_IP_PROVIDER_NAME = "ipify"
PUBLIC_IP_PROVIDER_URL = "https://api.ipify.org"
PUBLIC_IP_PROVIDER_TIMEOUT_SECONDS = 5.0
PublicIPResultStatus = Literal["first_seen", "unchanged", "changed", "error"]
PublicIPv4Fetcher = Callable[[], str]
class PublicIPCheckError(RuntimeError):
"""Raised when the public IPv4 provider cannot return a valid IPv4."""
@dataclass(slots=True)
class PublicIPCheckResult:
status: PublicIPResultStatus
checked_at: datetime
changed: bool
previous_ipv4: str | None = None
current_ipv4: str | None = None
def check_public_ipv4(
session: Session,
*,
fetch_public_ipv4: PublicIPv4Fetcher | None = None,
provider_name: str = PUBLIC_IP_PROVIDER_NAME,
) -> PublicIPCheckResult:
checked_at = _utc_now()
state = session.scalar(select(PublicIPState).where(PublicIPState.id == 1).limit(1))
try:
raw_ipv4 = (fetch_public_ipv4 or fetch_public_ipv4_from_provider)()
current_ipv4 = _validate_ipv4(raw_ipv4)
except PublicIPCheckError as exc:
logger.warning("Public IPv4 check failed: %s", exc)
if state is not None:
state.last_checked_at = checked_at
state.last_check_status = "error"
state.last_check_error = str(exc)
state.last_provider = provider_name
session.commit()
return PublicIPCheckResult(status="error", checked_at=checked_at, changed=False)
if state is None:
state = PublicIPState(
id=1,
current_ipv4=current_ipv4,
previous_ipv4=None,
first_seen_at=checked_at,
last_checked_at=checked_at,
last_changed_at=None,
last_check_status="first_seen",
last_check_error=None,
last_provider=provider_name,
)
session.add(state)
session.add(
PublicIPHistory(
ipv4=current_ipv4,
observed_at=checked_at,
change_type="first_seen",
provider=provider_name,
)
)
session.commit()
return PublicIPCheckResult(
status="first_seen",
checked_at=checked_at,
changed=False,
current_ipv4=current_ipv4,
)
if state.current_ipv4 == current_ipv4:
state.last_checked_at = checked_at
state.last_check_status = "unchanged"
state.last_check_error = None
state.last_provider = provider_name
session.commit()
return PublicIPCheckResult(
status="unchanged",
checked_at=checked_at,
changed=False,
current_ipv4=current_ipv4,
)
previous_ipv4 = state.current_ipv4
state.previous_ipv4 = previous_ipv4
state.current_ipv4 = current_ipv4
state.last_checked_at = checked_at
state.last_changed_at = checked_at
state.last_check_status = "changed"
state.last_check_error = None
state.last_provider = provider_name
session.add(
PublicIPHistory(
ipv4=current_ipv4,
observed_at=checked_at,
change_type="changed",
provider=provider_name,
)
)
session.commit()
return PublicIPCheckResult(
status="changed",
checked_at=checked_at,
changed=True,
previous_ipv4=previous_ipv4,
current_ipv4=current_ipv4,
)
def check_public_ipv4_and_notify(
session: Session,
*,
bootstrap_settings: Settings,
fetch_public_ipv4: PublicIPv4Fetcher | None = None,
provider_name: str = PUBLIC_IP_PROVIDER_NAME,
) -> PublicIPCheckResult:
result = check_public_ipv4(
session,
fetch_public_ipv4=fetch_public_ipv4,
provider_name=provider_name,
)
if result.status != "changed" or result.previous_ipv4 is None or result.current_ipv4 is None:
return result
runtime_settings = build_runtime_settings(session, bootstrap_settings)
try:
send_public_ip_changed_email(
runtime_settings,
previous_ipv4=result.previous_ipv4,
current_ipv4=result.current_ipv4,
detected_at=result.checked_at,
)
except (EmailConfigurationError, EmailDeliveryError) as exc:
logger.warning("Public IPv4 change notification failed: %s", exc)
return result
def fetch_public_ipv4_from_provider() -> str:
try:
response = httpx.get(
PUBLIC_IP_PROVIDER_URL,
params={"format": "text"},
timeout=PUBLIC_IP_PROVIDER_TIMEOUT_SECONDS,
)
response.raise_for_status()
except httpx.HTTPError as exc:
raise PublicIPCheckError(f"provider request failed: {exc}") from exc
return response.text.strip()
def _validate_ipv4(raw_value: str) -> str:
if not raw_value:
raise PublicIPCheckError("provider returned an empty response")
try:
parsed = ipaddress.ip_address(raw_value)
except ValueError as exc:
raise PublicIPCheckError("provider returned an invalid IPv4 value") from exc
if parsed.version != 4:
raise PublicIPCheckError("provider returned a non-IPv4 value")
return str(parsed)
def _utc_now() -> datetime:
return datetime.now(UTC)
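The validation in `_validate_ipv4` rejects empty responses, non-addresses, and IPv6 literals, and returns the normalized address string. A standalone version, condensed to raise plain `ValueError` instead of the module's `PublicIPCheckError`:

```python
import ipaddress

def validate_ipv4(raw: str) -> str:
    if not raw:
        raise ValueError("empty response")
    parsed = ipaddress.ip_address(raw)  # raises ValueError on garbage
    if parsed.version != 4:
        raise ValueError("not an IPv4 address")
    return str(parsed)

print(validate_ipv4("203.0.113.7"))
# → 203.0.113.7
for bad in ("", "not-an-ip", "2001:db8::1"):
    try:
        validate_ipv4(bad)
    except ValueError:
        pass  # each of these is rejected
```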
@@ -0,0 +1,6 @@
from app.config import Settings
def build_status_payload(settings: Settings) -> dict[str, str]:
return {"status": "ok", "environment": settings.app_env}
@@ -0,0 +1,245 @@
:root {
--bg: #f4f1ea;
--panel: rgba(255, 255, 255, 0.88);
--text: #1f2933;
--muted: #5b6875;
--accent: #2d6a4f;
--border: rgba(31, 41, 51, 0.08);
}
* {
box-sizing: border-box;
}
body {
margin: 0;
min-height: 100vh;
font-family: "Iowan Old Style", "Palatino Linotype", "Book Antiqua", serif;
color: var(--text);
background:
radial-gradient(circle at top left, rgba(45, 106, 79, 0.18), transparent 28%),
linear-gradient(160deg, #f7f4ee 0%, #ece6d8 100%);
}
.shell {
width: min(880px, calc(100% - 32px));
margin: 48px auto;
}
.panel {
padding: 32px;
border: 1px solid var(--border);
border-radius: 24px;
background: var(--panel);
backdrop-filter: blur(12px);
box-shadow: 0 20px 60px rgba(31, 41, 51, 0.12);
}
.eyebrow {
margin: 0 0 8px;
font-size: 0.85rem;
letter-spacing: 0.12em;
text-transform: uppercase;
color: var(--accent);
}
h1 {
margin: 0 0 16px;
font-size: clamp(2rem, 4vw, 3.2rem);
}
.lead {
margin: 0 0 24px;
line-height: 1.7;
color: var(--muted);
}
.meta {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(180px, 1fr));
gap: 16px;
margin: 0;
}
.single-column {
grid-template-columns: minmax(180px, 320px);
margin-bottom: 24px;
}
.meta div {
padding: 16px;
border-radius: 16px;
background: rgba(255, 255, 255, 0.7);
border: 1px solid rgba(31, 41, 51, 0.06);
}
.meta dt {
margin-bottom: 8px;
font-size: 0.9rem;
color: var(--muted);
}
.meta dd {
margin: 0;
font-size: 1.05rem;
}
a {
color: var(--accent);
}
.auth-panel {
max-width: 520px;
margin-inline: auto;
}
.auth-form,
.logout-form {
display: grid;
gap: 16px;
}
.auth-form label {
display: grid;
gap: 8px;
font-size: 0.95rem;
color: var(--muted);
}
.auth-form input {
width: 100%;
padding: 12px 14px;
border: 1px solid rgba(31, 41, 51, 0.14);
border-radius: 12px;
background: rgba(255, 255, 255, 0.92);
color: var(--text);
font: inherit;
}
button {
width: fit-content;
min-width: 120px;
padding: 12px 18px;
border: none;
border-radius: 999px;
background: var(--accent);
color: white;
font: inherit;
cursor: pointer;
}
button:hover {
filter: brightness(1.04);
}
.alert {
margin-bottom: 16px;
padding: 12px 14px;
border-radius: 12px;
background: rgba(157, 37, 37, 0.08);
border: 1px solid rgba(157, 37, 37, 0.14);
color: #8b2a2a;
}
.notice {
margin-bottom: 16px;
padding: 12px 14px;
border-radius: 12px;
background: rgba(45, 106, 79, 0.08);
border: 1px solid rgba(45, 106, 79, 0.14);
color: var(--accent);
}
.config-block + .config-block {
margin-top: 28px;
}
.config-block h2 {
margin: 0 0 16px;
font-size: 1.25rem;
}
.config-form {
display: grid;
gap: 20px;
}
.config-section {
margin: 0;
padding: 18px;
border: 1px solid rgba(31, 41, 51, 0.08);
border-radius: 16px;
display: grid;
gap: 14px;
}
.config-section legend {
padding: 0 8px;
color: var(--accent);
}
.config-form label small {
color: var(--muted);
}
.integration-action-row {
display: flex;
align-items: center;
justify-content: space-between;
gap: 16px;
padding-top: 8px;
border-top: 1px solid rgba(31, 41, 51, 0.08);
}
.integration-action-title {
margin: 0 0 6px;
font-weight: 600;
color: var(--text);
}
.integration-action-copy {
margin: 0;
color: var(--muted);
line-height: 1.5;
}
.button-link {
display: inline-flex;
align-items: center;
justify-content: center;
width: fit-content;
min-width: 120px;
padding: 12px 18px;
border: none;
border-radius: 999px;
background: var(--accent);
color: white;
text-decoration: none;
cursor: pointer;
}
.button-link:hover {
filter: brightness(1.04);
}
.button-link.disabled {
background: rgba(91, 104, 117, 0.28);
color: rgba(31, 41, 51, 0.72);
cursor: not-allowed;
pointer-events: none;
}
@media (max-width: 640px) {
.shell {
margin: 24px auto;
}
.panel {
padding: 24px;
}
.integration-action-row {
align-items: stretch;
flex-direction: column;
}
}
@@ -0,0 +1,16 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>{% block title %}{{ app_name }}{% endblock %}</title>
<link rel="icon" href="data:,">
<link rel="stylesheet" href="/static/styles.css">
</head>
<body>
<main class="shell">
{% block content %}{% endblock %}
</main>
</body>
</html>
@@ -0,0 +1,139 @@
{% extends "base.html" %}
{% block title %}Config · {{ app_name }}{% endblock %}
{% block content %}
<section class="panel">
<p class="eyebrow">Configuration</p>
<h1>Config</h1>
{% if force_password_change %}
<div class="alert">
You must change your password after your first login. Once that is done, you can continue using this configuration page.
</div>
{% endif %}
{% if password_change_error %}
<div class="alert">{{ password_change_error }}</div>
{% endif %}
{% if config_error %}
<div class="alert">{{ config_error }}</div>
{% endif %}
{% if config_saved %}
<div class="notice">Config saved to the app database. Some changes may require an app restart.</div>
{% endif %}
{% if ticktick_oauth_error %}
<div class="alert">{{ ticktick_oauth_error }}</div>
{% endif %}
{% if ticktick_oauth_notice %}
<div class="notice">{{ ticktick_oauth_notice }}</div>
{% endif %}
{% if smtp_test_error %}
<div class="alert">{{ smtp_test_error }}</div>
{% endif %}
{% if smtp_test_notice %}
<div class="notice">{{ smtp_test_notice }}</div>
{% endif %}
<div class="meta single-column">
<div>
<dt>Current User</dt>
<dd>admin</dd>
</div>
</div>
<section class="config-block">
<h2>Change Password</h2>
<form class="auth-form" method="post" action="/config/change-password">
<input type="hidden" name="csrf_token" value="{{ csrf_token }}">
<label>
<span>Current Password</span>
<input type="password" name="current_password" autocomplete="current-password" required>
</label>
<label>
<span>New Password</span>
<input type="password" name="new_password" autocomplete="new-password" required>
</label>
<label>
<span>Confirm New Password</span>
<input type="password" name="confirm_password" autocomplete="new-password" required>
</label>
<button type="submit">Change Password</button>
</form>
</section>
<section class="config-block">
<h2>Config</h2>
<form class="config-form" method="post" action="/config">
<input type="hidden" name="csrf_token" value="{{ csrf_token }}">
{% for section in config_sections %}
<fieldset class="config-section">
<legend>{{ section.name }}</legend>
{% for field in section.fields %}
<label>
<span>{{ field.label }}</span>
{% if field.secret %}
<input type="{{ field.input_type }}" name="{{ field.env_name }}" value="" placeholder="leave blank to keep current value">
<small>{% if field.configured %}configured{% else %}not configured{% endif %}</small>
{% else %}
<input type="{{ field.input_type }}" name="{{ field.env_name }}" value="{{ field.value }}">
{% endif %}
</label>
{% endfor %}
{% if section.name == "TickTick" %}
<div class="integration-action-row">
<div>
<p class="integration-action-title">TickTick OAuth</p>
<p class="integration-action-copy">Redirect URI: {{ ticktick_redirect_uri or "configure APP_HOSTNAME to generate the callback URI" }}</p>
{% if ticktick_oauth_ready %}
<p class="integration-action-copy">Use the saved TickTick client settings to start the authorization flow.</p>
{% else %}
<p class="integration-action-copy">Fill in App Hostname, TickTick Client ID, and TickTick Client Secret before starting OAuth.</p>
{% endif %}
</div>
{% if ticktick_oauth_ready %}
<a class="button-link" href="/ticktick/auth/start">Authorize TickTick</a>
{% else %}
<span class="button-link disabled" aria-disabled="true">Authorize TickTick</span>
{% endif %}
</div>
{% endif %}
{% if section.name == "SMTP" %}
<div class="integration-action-row">
<div>
<p class="integration-action-title">SMTP Test Email</p>
<p class="integration-action-copy">Save the SMTP settings first, then send a simple plaintext test email to the configured recipient.</p>
</div>
{% if smtp_test_ready %}
<button type="submit" formaction="/config/smtp/test" formmethod="post">Send SMTP Test</button>
{% else %}
<span class="button-link disabled" aria-disabled="true">Send SMTP Test</span>
{% endif %}
</div>
{% endif %}
</fieldset>
{% endfor %}
<button type="submit">Save Config</button>
</form>
</section>
<form class="logout-form" method="post" action="/logout">
<input type="hidden" name="csrf_token" value="{{ csrf_token }}">
<button type="submit">Log Out</button>
</form>
</section>
{% endblock %}
@@ -0,0 +1,36 @@
{% extends "base.html" %}
{% block title %}{{ app_name }}{% endblock %}
{% block content %}
<section class="panel">
<p class="eyebrow">Python Rewrite Skeleton</p>
<h1>{{ app_name }}</h1>
<p class="lead">
This is the base skeleton for the Python rewrite of the current Go backend. At this stage it only provides
the application entry point, configuration, database, tests, templates, and containerization groundwork;
business logic has not been migrated yet.
</p>
<dl class="meta">
<div>
<dt>Environment</dt>
<dd>{{ app_env }}</dd>
</div>
<div>
<dt>Health Check</dt>
<dd><a href="/status">/status</a></dd>
</div>
<div>
<dt>OpenAPI</dt>
<dd><a href="/docs">/docs</a></dd>
</div>
<div>
<dt>Login</dt>
<dd><a href="/login">/login</a></dd>
</div>
<div>
<dt>Notion</dt>
<dd>{{ notion_status }}</dd>
</div>
</dl>
</section>
{% endblock %}
@@ -0,0 +1,33 @@
{% extends "base.html" %}
{% block title %}Login · {{ app_name }}{% endblock %}
{% block content %}
<section class="panel auth-panel">
<p class="eyebrow">Authentication</p>
<h1>Log In</h1>
<p class="lead">
After a successful login you will be taken to the protected config page.
</p>
{% if error_message %}
<div class="alert">{{ error_message }}</div>
{% endif %}
<form class="auth-form" method="post" action="/login">
<input type="hidden" name="csrf_token" value="{{ csrf_token }}">
<label>
<span>Username</span>
<input type="text" name="username" autocomplete="username" required>
</label>
<label>
<span>Password</span>
<input type="password" name="password" autocomplete="current-password" required>
</label>
<button type="submit">Log In</button>
</form>
</section>
{% endblock %}
@@ -0,0 +1,6 @@
-r requirements.in
httpx>=0.28,<1.0
pip-tools>=7.4,<8.0
pytest>=8.3,<9.0
@@ -0,0 +1,134 @@
#
# This file is autogenerated by pip-compile with Python 3.13
# by the following command:
#
# pip-compile dev-requirements.in
#
alembic==1.18.4
# via -r requirements.in
annotated-types==0.7.0
# via pydantic
anyio==4.13.0
# via
# httpx
# starlette
# watchfiles
apscheduler==3.11.2
# via -r requirements.in
argon2-cffi==25.1.0
# via -r requirements.in
argon2-cffi-bindings==25.1.0
# via argon2-cffi
build==1.4.3
# via pip-tools
certifi==2026.2.25
# via
# httpcore
# httpx
cffi==2.0.0
# via argon2-cffi-bindings
click==8.3.2
# via
# pip-tools
# uvicorn
fastapi==0.115.14
# via -r requirements.in
greenlet==3.4.0
# via sqlalchemy
h11==0.16.0
# via
# httpcore
# uvicorn
httpcore==1.0.9
# via httpx
httptools==0.7.1
# via uvicorn
httpx==0.28.1
# via
# -r dev-requirements.in
# -r requirements.in
idna==3.11
# via
# anyio
# httpx
iniconfig==2.3.0
# via pytest
jinja2==3.1.6
# via -r requirements.in
mako==1.3.11
# via alembic
markupsafe==3.0.3
# via
# jinja2
# mako
packaging==26.1
# via
# build
# pytest
# wheel
pip-tools==7.5.3
# via -r dev-requirements.in
pluggy==1.6.0
# via pytest
pycparser==2.23
# via cffi
pydantic==2.13.2
# via
# fastapi
# pydantic-settings
pydantic-core==2.46.2
# via pydantic
pydantic-settings==2.13.1
# via -r requirements.in
pygments==2.20.0
# via pytest
pyproject-hooks==1.2.0
# via
# build
# pip-tools
pytest==8.4.2
# via -r dev-requirements.in
python-dotenv==1.2.2
# via
# pydantic-settings
# uvicorn
python-multipart==0.0.26
# via -r requirements.in
pyyaml==6.0.3
# via
# -r requirements.in
# uvicorn
sqlalchemy==2.0.49
# via
# -r requirements.in
# alembic
starlette==0.46.2
# via fastapi
typing-extensions==4.15.0
# via
# alembic
# fastapi
# pydantic
# pydantic-core
# sqlalchemy
# typing-inspection
typing-inspection==0.4.2
# via
# pydantic
# pydantic-settings
tzlocal==5.3.1
# via apscheduler
uvicorn[standard]==0.44.0
# via -r requirements.in
uvloop==0.22.1
# via uvicorn
watchfiles==1.1.1
# via uvicorn
websockets==16.0
# via uvicorn
wheel==0.46.3
# via pip-tools
# The following packages are considered to be unsafe in a requirements file:
# pip
# setuptools
@@ -0,0 +1,6 @@
services:
migration:
build: .
app:
build: .
@@ -0,0 +1,46 @@
services:
migration:
container_name: home-automation-migration
image: code.wanderingbadger.dev/tliu93/home-automation:latest
user: "1000:1000"
restart: "no"
init: true
command: ["python", "-m", "scripts.run_migrations"]
volumes:
- ./data:/app/data
- ./.env:/app/.env:ro
app:
container_name: home-automation-app
image: code.wanderingbadger.dev/tliu93/home-automation:latest
user: "1000:1000"
restart: unless-stopped
init: true
depends_on:
migration:
condition: service_completed_successfully
ports:
- "127.0.0.1:8881:8000"
volumes:
- ./data:/app/data
- ./.env:/app/.env:ro
grafana:
image: grafana/grafana:latest
container_name: home-automation-grafana
depends_on:
- app
restart: unless-stopped
ports:
- "10.238.75.70:8882:3000"
environment:
GF_PLUGINS_PREINSTALL: frser-sqlite-datasource
volumes:
- ./data:/data/home-automation:ro
- ./grafana/provisioning:/etc/grafana/provisioning:ro
- ./grafana/dashboards:/var/lib/grafana/dashboards:ro
- homeautomation_grafana_storage:/var/lib/grafana
volumes:
homeautomation_grafana_storage:
name: homeautomation_grafana_storage
@@ -0,0 +1,5 @@
#!/bin/sh
set -eu
exec "$@"
@@ -0,0 +1,98 @@
# Python Skeleton Architecture Overview
This document describes the responsibility boundaries and directory layout of the current Python skeleton. It describes the carrier for subsequent migration work, not a complete business implementation.
## Current goal
The goal of this round is a stable, lightweight, and extensible engineering foundation, so that the following can be migrated incrementally:
- TickTick integration
- Home Assistant integration
- poo records
- location / life trajectory
## Directory layout
### `app/`
Core application code.
- `main.py`
  - FastAPI app factory
  - lifespan
  - base route registration
- `config.py`
  - environment-variable-driven settings
- `auth_db.py`
  - app-level shared auth database
- `db.py`
  - SQLAlchemy engine / session / Base
- `dependencies.py`
  - shared dependency injection
- `api/`
  - HTTP routes
  - `/login`, `/logout`, and `/admin` have been migrated
  - `GET /public-ip/check` has been migrated
  - the first version of the `POST /homeassistant/publish` entry point has been migrated
  - `POST /poo/record` and `GET /poo/latest` have been migrated
- `models/`
  - SQLAlchemy models
  - `auth`, `location`, and `poo` currently use their own independent database bases
- `schemas/`
  - Pydantic schemas
- `services/`
  - the business service layer
  - the DB persistence logic for the config page has been migrated
  - the public IPv4 check, state persistence, and change notification logic has been migrated
  - the SMTP sending and test-send logic has been migrated
- `integrations/`
  - adapters for external systems
  - the Home Assistant outbound adapter has been migrated
- `templates/`
  - Jinja2 templates
- `static/`
  - minimal static assets
### `alembic_location/`
Migration infrastructure for the location DB.
### `alembic_app/`
Migration infrastructure for the app DB.
### `alembic_poo/`
Migration infrastructure for the poo DB.
### `tests/`
The pytest test directory. It can grow naturally to cover:
- unit tests
- mock tests
- integration tests
### `scripts/`
Helper scripts. Currently contains the OpenAPI export script.
## Current constraints
- this round builds only the skeleton; no business logic is migrated
- the databases stay on SQLite
- no frontend/backend split is introduced
- no Notion module is designed
- notification capability stays minimal; no standalone notification center or multi-channel abstraction is introduced
## About Notion
Notion is still a live module in the Go version, but in the Python rewrite it is explicitly removed scope.
The current Python skeleton therefore:
- provides no Notion integration module
- provides no Notion schema
- reserves no Notion-related business flows
If its historical role ever needs revisiting, refer to the Go version and the existing migration inventory docs instead of keeping it in the Python skeleton.
@@ -0,0 +1,120 @@
# Basic Authentication Notes
This document describes the first version of the authentication foundation already in place in the Python rewrite.
This round only covers:
- a login page
- the login / logout flow
- server-side sessions
- one minimal protected page
This round explicitly does not cover:
- full config persistence
- full config CRUD
- a multi-user permission system
- OAuth / SSO / RBAC
## Current auth model
- authentication: `username/password`
- session model: server-side session
- client credential: session cookie
- page form: Jinja server-side templates
## Current persistence
A shared App DB has been added:
- `APP_DATABASE_URL`
- default: `sqlite:///./data/app.db`
Auth-related data currently lives in this DB:
- `auth_users`
- `auth_sessions`
- `app_config`
Auth data is not mixed into the `location` / `poo` DBs.
This part is now also managed via Alembic:
- Alembic environment: `alembic_app.ini` + `alembic_app/`
- initialization script: `python scripts/app_db_adopt.py`
There is no legacy app DB, so this version of the script only initializes a new database; it does no legacy adoption.
`app_config` now carries runtime configuration persistence.
Specifically:
- `.env` handles bootstrap / fallback
- the `app_config` table holds runtime configuration overrides
- the login password remains authentication data: it is hashed with Argon2 and never stored in `app_config`
## First start and bootstrap
If the auth DB contains no users yet, the application requires at startup:
- `AUTH_BOOTSTRAP_USERNAME`
- `AUTH_BOOTSTRAP_PASSWORD`
and creates the first admin user.
The current default bootstrap values are:
- username: `admin`
- password: `admin`
After the first login, the system forces a password change.
To override the defaults before the first start, set the environment variables directly:
- `AUTH_BOOTSTRAP_USERNAME`
- `AUTH_BOOTSTRAP_PASSWORD`
The recommended flow is:
1. set up `.env`
2. run `python scripts/app_db_adopt.py`
3. start the application
4. log in once with `admin / admin`
5. change the password immediately
## Security design
Basic security measures already in place in this version:
- passwords are never stored in plaintext; Argon2 hashing is used
- the session cookie is `HttpOnly`
- cookies use `SameSite=Lax`
- `Secure` cookies are on by default outside the `development` environment
- both the login and logout forms carry basic CSRF validation
- session tokens are randomly generated; the server persists only the token hash
- sessions have an expiry and an explicit invalidation mechanism
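The "persist only the token hash" point can be sketched with the standard library; the function names below are illustrative, not the actual service API:

```python
import hashlib
import hmac
import secrets

def issue_session_token() -> tuple[str, str]:
    """Return (client_token, stored_hash). Only the hash is persisted server-side."""
    token = secrets.token_urlsafe(32)  # random, unguessable client credential
    return token, hashlib.sha256(token.encode()).hexdigest()

def verify_session_token(client_token: str, stored_hash: str) -> bool:
    """Re-hash the presented token and compare in constant time."""
    candidate = hashlib.sha256(client_token.encode()).hexdigest()
    return hmac.compare_digest(candidate, stored_hash)
```

A leaked database row then reveals only the hash, never a usable session cookie.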
## Current protected surface
Only these page entry points are protected this round:
- `GET /config`
- `POST /config`
- `POST /config/change-password`
- `POST /logout`
Related flows:
- `GET /login`
- `POST /login`
Unauthenticated access to `/config` redirects to `/login`.
## Out of scope for this round
Work that can follow on top of this foundation:
- wiring up the config page
- config persistence
- a finer-grained set of protected routes
- a more formal runbook for user initialization / password rotation
@@ -0,0 +1,67 @@
# Home Assistant Inbound Gateway
This document describes the first version of the Home Assistant inbound gateway migrated into the current Python project.
Inbound here means:
- Home Assistant actively calling into this app
The restored entry point is:
- `POST /homeassistant/publish`
## Request Envelope
The legacy Go envelope shape is kept:
```json
{
"target": "location_recorder",
"action": "record",
"content": "{'person': 'alice', 'latitude': '1.23', 'longitude': '4.56'}"
}
```
Notes:
- `target`, `action`, and `content` are all required
- unknown fields are rejected
- `content` still accepts the single-quoted JSON string style common in legacy payloads
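The single-quoted legacy `content` style above is not valid JSON; one way to accept it (a sketch, not necessarily how the gateway implements it) is to fall back to a safe literal evaluation:

```python
import ast
import json

def parse_content(content: str) -> dict:
    """Parse the envelope's content field.

    Tries proper JSON first, then falls back to the legacy
    single-quoted dict style via ast.literal_eval, which only
    evaluates literals and never executes code.
    """
    try:
        return json.loads(content)
    except json.JSONDecodeError:
        value = ast.literal_eval(content)
        if not isinstance(value, dict):
            raise ValueError("content must decode to an object")
        return value
```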
## Supported Target / Action pairs
Paths wired back so far:
- `location_recorder / record`
- `ticktick / create_action_task`
Specifically:
- `location_recorder / record` parses `content` as a location recorder request and goes straight through the location write path in the current Python project
- `ticktick / create_action_task` keeps the legacy behavior and parses `content` as:
  - `action: string`
  - `due_hour: int`
  - an optional `title` field is ignored
  - the TickTick task title still uses `action`
  - the due date still follows the legacy semantics: take `now + due_hour`, snap to midnight of the following day, then convert to UTC before sending to TickTick
  - the target project still comes from `HOME_ASSISTANT_ACTION_TASK_PROJECT_ID`
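The legacy due-date rule can be sketched as follows, assuming a timezone-aware `now`; the function name is illustrative:

```python
from datetime import datetime, time, timedelta, timezone

def legacy_due_date(now: datetime, due_hour: int) -> datetime:
    """Take now + due_hour, snap to midnight of the following day,
    then convert the result to UTC."""
    shifted = now + timedelta(hours=due_hour)
    next_midnight = datetime.combine(
        shifted.date() + timedelta(days=1), time.min, tzinfo=shifted.tzinfo
    )
    return next_midnight.astimezone(timezone.utc)
```

For example, 10:00 on 2026-04-29 in UTC+2 with `due_hour=5` shifts to 15:00 the same day, snaps to 2026-04-30 00:00 local, and is sent as 2026-04-29 22:00 UTC.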
## Not yet wired back
The following legacy paths have not been migrated at this stage:
- `poo_recorder / get_latest`
- other undefined target/action pairs
These requests currently return:
- `500 internal server error`
## Error handling
The current policy stays simple:
- invalid envelope, missing fields, unknown fields, or invalid `content`: `400 bad request`
- target/action not yet migrated: `500 internal server error`
Responses to callers stay terse and do not expose internals; detailed reasons are written to the logs only.
@@ -0,0 +1,51 @@
# Home Assistant Outbound Integration
This document describes the Home Assistant outbound integration layer migrated into the current Python project.
Outbound here means:
- this app actively calling Home Assistant
Not included:
- `/homeassistant/publish`
- the Home Assistant inbound command gateway
- routing of inbound messages where Home Assistant drives this app
## Supported capabilities
`app/integrations/homeassistant.py` provides a lightweight `HomeAssistantClient` that supports:
- publishing / updating a sensor state
  - `POST /api/states/{entity_id}`
- triggering a Home Assistant webhook
  - `POST /api/webhook/{webhook_id}`
Both capabilities were migrated from the outbound behavior in `util/homeassistantutil/homeassistantutil.go` of the legacy Go version.
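The state-update call shape can be sketched as pure request construction (standard library only; `build_state_request` is an illustrative helper, not the actual `HomeAssistantClient` API):

```python
import json

def build_state_request(base_url: str, token: str, entity_id: str,
                        state: str, attributes: dict) -> tuple[str, dict, bytes]:
    """Build the URL, headers, and JSON body for POST /api/states/{entity_id}."""
    url = f"{base_url.rstrip('/')}/api/states/{entity_id}"
    headers = {
        "Authorization": f"Bearer {token}",  # long-lived access token
        "Content-Type": "application/json",
    }
    body = json.dumps({"state": state, "attributes": attributes}).encode()
    return url, headers, body
```

The webhook call follows the same pattern against `/api/webhook/{webhook_id}`.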
## Configuration
The outbound adapter depends on:
- `HOME_ASSISTANT_BASE_URL`
- `HOME_ASSISTANT_AUTH_TOKEN`
- `HOME_ASSISTANT_TIMEOUT_SECONDS`
If a required setting is missing, the client raises a configuration error instead of silently skipping the call.
## Error handling policy
The current policy stays conservative and simple:
- missing configuration: raise `HomeAssistantConfigError`
- clearly invalid arguments: raise `ValueError`
- Home Assistant responds with anything other than 200/201: raise `HomeAssistantRequestError`
- a failed network request: raise `HomeAssistantRequestError`
Not yet implemented:
- automatic retries
- circuit breaking
- more elaborate backoff strategies
The focus of this round is migrating the app -> Home Assistant outbound contract and a reusable structure.
@@ -0,0 +1,176 @@
# Location Recorder
This document describes the current database adoption strategy for the `location recorder` in the Python project, plus the runbook for adopting the legacy SQLite database.
The current Python validation policy for `POST /location/record` is:
- `latitude` and `longitude` are required; return `400 bad request` when either is missing or cannot be parsed as a valid number
- `altitude` is optional; missing or invalid values are treated as `0`
- unknown fields still return `400 bad request`
- error responses to callers stay terse and do not expose the underlying validation details; detailed reasons go to the logs only
## Legacy factual baseline
The actual schema of the `location` table in the legacy SQLite database is:
```sql
CREATE TABLE location (
person TEXT NOT NULL,
datetime TEXT NOT NULL,
latitude REAL NOT NULL,
longitude REAL NOT NULL,
altitude REAL,
PRIMARY KEY (person, datetime)
);
```
Historically, the legacy Go implementation used:
```sql
PRAGMA user_version = 2;
```
This means the old system once relied on `user_version` to manage the location database version, but that is no longer the long-term migration mechanism for the Python project.
## Current strategy
The minimal adoption approach currently taken is:
1. treat the `location` schema above as the Alembic baseline
2. initialize new databases via Alembic `upgrade head`
3. for an existing legacy SQLite database, first confirm the schema matches the baseline, then adopt it via `alembic stamp`
4. if the database already contains `alembic_version`, first confirm the current revision matches the expected project baseline
5. only when the revisions match is the database considered correctly adopted
6. `PRAGMA user_version` will no longer be the primary migration mechanism
The current baseline revision is:
- `20260419_01_location_baseline`
The minimal script entry point provided is:
```bash
python scripts/location_db_adopt.py
```
If you prefer running it as a module, this also works:
```bash
python -m scripts.location_db_adopt
```
It works only against `LOCATION_DATABASE_URL` and follows conservative adoption principles:
- a local DB file already exists: validate first, then adopt
- no local DB file: initialize a new database
- any validation failure: fail immediately and stop
The application itself does not initialize the `location` database for you on startup.
On startup the application runs a read-only check against `LOCATION_DATABASE_URL`:
- file missing: fail directly and point to the adoption script
- file exists but has no `alembic_version` yet: fail directly and require legacy adoption first
- file is Alembic-managed but the revision does not match: fail directly and refuse to start
This is deliberate: it keeps the app from silently creating a new database on a wrong path, or running business logic against the wrong database version.
## New database initialization
If the DB file that `LOCATION_DATABASE_URL` points to does not exist locally:
- the script first creates the parent directory
- then runs Alembic `upgrade head`
- which creates the `location` table and `alembic_version`
Doing it manually is equivalent to:
```bash
alembic upgrade head
```
This creates the same `location` table structure as legacy and records the Alembic revision in the database.
## Legacy database adoption
For an already-existing legacy SQLite database:
1. confirm the DB file exists
2. if an `alembic_version` table already exists, read the current revision first
3. if the revision equals `20260419_01_location_baseline`, the database is considered correctly adopted by Alembic
4. if the revision does not match, fail immediately and stop; do no automatic repair
5. if there is no `alembic_version` table yet, read the actual schema of the `location` table in the DB
6. compare it strictly against the baseline schema
7. then check `PRAGMA user_version`
8. only when the schema matches and `user_version = 2`, run Alembic `stamp`
9. after adoption, subsequent migrations are managed by Alembic
Example:
```bash
LOCATION_DATABASE_URL=sqlite:///./data/locationRecorder.db alembic stamp 20260419_01_location_baseline
```
Or run the script directly:
```bash
LOCATION_DATABASE_URL=sqlite:///./data/locationRecorder.db python scripts/location_db_adopt.py
```
What this step means:
- tell Alembic that this database is already at the baseline structure
- do not modify any existing `location` table data
- hand subsequent migrations over to Alembic
## Fail-closed principle
The strategy is conservative adoption; unknown legacy states are never repaired automatically.
The script fails immediately and stops if any of the following occurs:
- the `location` table cannot be found
- the `location` table schema does not match the baseline
- `PRAGMA user_version` is not `2`
- `alembic_version` exists, but its revision does not match the expected baseline
- the target DB is not a SQLite URL
The script never attempts to:
- repair tables automatically
- adjust `user_version` automatically
- guess at unknown legacy states
If any of these occur, verify the database state manually first, then decide whether a separate migration or repair is needed.
## About `data/locationRecorder.db`
The legacy sample database kept locally at `data/locationRecorder.db` can be used to:
- cross-check the schema by hand
- manually verify the `stamp` adoption flow
- confirm compatibility during development
The code must not hard-depend on this file existing, however.
## Safe use of test samples
When using a legacy SQLite sample for tests or verification:
1. never run tests directly against the original sample file
2. copy it to a temporary path first
3. run all `stamp` calls, writes, and experimental migrations against the copy only
The approach the automated tests currently take:
- build a temporary "legacy style" SQLite file
- create the same `location` table in it
- set `PRAGMA user_version = 2`
- run the adopt logic from the adoption script against it
They also cover:
- the new-database initialization path when the DB file is missing
- the failure path on schema mismatch
- the failure path on `user_version` mismatch
This verifies the adoption path without polluting the real sample database.
@@ -0,0 +1,140 @@
# Poo Recorder
This document describes the current behavior boundaries of the `poo recorder` in the Python project, and the Alembic adoption strategy for the poo SQLite database.
## Current baseline
The actual SQLite schema in the current production version is:
```sql
CREATE TABLE poo_records (
timestamp TEXT NOT NULL,
status TEXT NOT NULL,
latitude REAL NOT NULL,
longitude REAL NOT NULL,
PRIMARY KEY (timestamp)
);
```
Historically, the legacy Go implementation used:
```sql
PRAGMA user_version = 1;
```
The Python migration treats this schema as the factual baseline and does not redesign the table structure.
## Migrated APIs
The Python project has wired up:
- `POST /poo/record`
- `GET /poo/latest`
### `POST /poo/record`
Purpose:
- record one poo event
- refresh the Home Assistant sensor on a best-effort basis
- if `POO_WEBHOOK_ID` is configured, trigger the Home Assistant webhook on a best-effort basis
Request body:
```json
{
"status": "done",
"latitude": "1.23",
"longitude": "4.56"
}
```
Current policy:
- unknown fields: `400 bad request`
- invalid numbers: `400 bad request`
- once the record is written, a failed Home Assistant side effect does not roll back the local DB write
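The "write first, best-effort side effect" rule can be sketched as follows (illustrative names; the real service layer differs):

```python
import logging

logger = logging.getLogger("poo")

def record_poo(save_record, publish_sensor) -> None:
    """Persist the record first; Home Assistant side effects must not
    fail the request or roll back the DB write."""
    save_record()  # local DB write: failures here do propagate
    try:
        publish_sensor()  # best effort only
    except Exception:
        logger.warning("Home Assistant side effect failed", exc_info=True)
```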
### `GET /poo/latest`
Purpose:
- read the latest poo record
- re-publish it to the Home Assistant sensor
The external behavior matches legacy:
- success: empty response body, HTTP 200
- if the DB contains no poo records yet: still an empty response body with HTTP 200, but no sensor is published
- an actual publish failure: terse `internal server error`
## Home Assistant side effects
The existing Home Assistant outbound adapter in the Python project is reused.
Supported:
- publish / update the poo status sensor
- optionally trigger a webhook
Related configuration:
- `HOME_ASSISTANT_BASE_URL`
- `HOME_ASSISTANT_AUTH_TOKEN`
- `HOME_ASSISTANT_TIMEOUT_SECONDS`
- `POO_SENSOR_ENTITY_NAME`
- `POO_SENSOR_FRIENDLY_NAME`
- `POO_WEBHOOK_ID`
## Alembic adoption strategy
The poo adoption logic deliberately mirrors location.
The current baseline revision:
- `20260420_01_poo_baseline`
The script entry point provided:
```bash
python scripts/poo_db_adopt.py
```
Or:
```bash
python -m scripts.poo_db_adopt
```
The rules:
1. if no local poo DB file exists:
   - treat it as new-database initialization
   - create the new database via `alembic_poo upgrade head`
2. if a legacy DB already exists locally:
   - check the `poo_records` table schema first
   - then check `PRAGMA user_version = 1`
   - only on a full match, adopt via Alembic `stamp`
3. if the schema or `user_version` does not match:
   - fail directly
   - no automatic repair
4. if the database already contains `alembic_version`:
   - accept it only if the revision matches the current baseline
   - otherwise fail directly
On startup the application also runs a read-only check against `POO_DATABASE_URL`:
- file missing: refuse to start
- DB not yet adopted by Alembic: refuse to start
- revision mismatch: refuse to start
## Notion explicitly removed
No Notion logic is migrated in this round.
In other words, the current Python poo recorder:
- keeps no Notion adapter
- keeps no Notion sync
- keeps no `tableId` dependency
- keeps no compatibility layer just because Notion exists in legacy
@@ -0,0 +1,126 @@
# Public IPv4 Monitor and Mail Notification
This document describes the responsibility boundaries and operation of the current public IPv4 monitor and the SMTP mail notification capability.
## Current scope
The current implementation covers only one small notification capability:
- check the current public IPv4 periodically or manually
- persist the current state and change history to the app DB
- send one plain-text English email only when the public IPv4 changes
Explicitly out of scope:
- automatic updates via the Namecheap API
- IPv6 checks
- error alert emails
- repeated reminders / escalating alerts
- Telegram / Slack / Discord notifications
- a full notification center or template system
## Data storage
All data goes into the app DB.
Relevant tables:
- `public_ip_state`
  - holds the current state
  - logically usually a single row
- `public_ip_history`
  - holds the first sighting and the change history
Public IP state is not stored in `app_config`.
## Check result semantics
One check returns one of four results:
- `first_seen`
- `unchanged`
- `changed`
- `error`
Behavior constraints:
- `first_seen`: write the current IP and the first history row, but send no notification email
- `unchanged`: update only the timestamp and status; no history row, no email
- `changed`: update `previous_ipv4` / `current_ipv4` / `last_changed_at`, write a history row, and send the email
- `error`: keep the last known valid IP; write no fake history row and send no email
## Manual and scheduled checks
Manual check entry point:
- `GET /public-ip/check`
Constraints:
- requires the existing authentication
- the response does not expose the IP itself
- only non-sensitive check results are returned
Scheduled check:
- an APScheduler job is registered on application startup
- it runs every 4 hours by default
- it reuses the same public IP check + notify logic as the manual check
## SMTP notification
Notification delivery reuses the existing SMTP sender.
Configuration it depends on:
- `SMTP_ENABLED`
- `SMTP_HOST`
- `SMTP_PORT`
- `SMTP_USERNAME`
- `SMTP_PASSWORD`
- `SMTP_FROM_NAME`
- `SMTP_FROM_ADDRESS`
- `SMTP_TO_ADDRESS`
- `SMTP_USE_STARTTLS`
Where:
- `SMTP_FROM_NAME` is the display name in the mail header
- the `From` header is rendered as `Name <mail@domain>`
- the SMTP envelope sender still uses the bare mail address, for compatibility
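The `Name <mail@domain>` rendering matches the standard library's `email.utils.formataddr`; a minimal sketch (the helper name is illustrative):

```python
from email.message import EmailMessage
from email.utils import formataddr

def build_change_mail(from_name: str, from_addr: str, to_addr: str,
                      previous_ip: str, current_ip: str) -> EmailMessage:
    msg = EmailMessage()
    msg["Subject"] = "Public IP changed"
    msg["From"] = formataddr((from_name, from_addr))  # display-name header
    msg["To"] = to_addr
    msg.set_content(f"Public IP changed from {previous_ip} to {current_ip}.")
    return msg

# The SMTP envelope sender stays the bare address, e.g.:
# smtp.send_message(msg, from_addr=from_addr, to_addrs=[to_addr])
```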
## Notification trigger condition
Email is sent only on `changed`.
No email is sent for:
- `first_seen`
- `unchanged`
- `error`
This keeps the same IP state from being notified repeatedly: after the first change, subsequent repeated checks become `unchanged`.
## Mail content
The mail subject is fixed:
- `Public IP changed`
The body is plain-text English and contains at least:
- the previous IP
- the current IP
- the detection time
The body also carries a one-line reminder to manually update the Namecheap trusted IP.
## Failure handling
Notification delivery is a best-effort add-on:
- public IP state persistence completes first
- a failed email send does not roll back the public IP state
- failures are only logged as warnings
This keeps the notification path from interfering with the main check flow.
@@ -0,0 +1,43 @@
# TickTick Integration
The TickTick migration in the current Python project restores the core legacy capability first, without expanding into a larger integration layer.
## Supported so far
- read TickTick configuration from the config table at runtime, with fallback to `.env` when missing
- `GET /ticktick/auth/start`
  - requires a logged-in session
  - generates an OAuth `state`
  - redirects straight to the TickTick authorization page
- `GET /ticktick/auth/code`
  - validates the in-process `state`
  - exchanges the authorization code for an access token
  - persists `TICKTICK_TOKEN` into `app_config`
- basic TickTick Open API calls:
  - list projects
  - list the tasks in a project
  - create a task
  - duplicate-creation protection via exact title match
- Home Assistant inbound has `ticktick / create_action_task` wired back
## Current configuration
- `APP_HOSTNAME`
- `TICKTICK_CLIENT_ID`
- `TICKTICK_CLIENT_SECRET`
- `TICKTICK_TOKEN`
- `HOME_ASSISTANT_ACTION_TASK_PROJECT_ID`
## Compatibility notes
- the legacy OAuth authorization code flow is kept
- the OAuth callback URI is now derived from `APP_HOSTNAME` and the current environment: `development` uses `http`, all other environments use `https`
- `state` remains in-process transient state; if the service restarts between start and callback, authorization has to be restarted in this round's implementation
- tokens are no longer written back to `.env` or other config files; they are written to the config table only
- the legacy third-party TickTick library is not brought in; compatible behavior is implemented with the standard library for now
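The callback derivation can be sketched as follows (the path is the documented `/ticktick/auth/code` route; the function name is illustrative):

```python
def ticktick_callback_uri(app_hostname: str, app_env: str) -> str:
    """Derive the OAuth callback URI from APP_HOSTNAME and the environment:
    development uses http, every other environment uses https."""
    scheme = "http" if app_env == "development" else "https"
    return f"{scheme}://{app_hostname}/ticktick/auth/code"
```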
## Work worth splitting out later
- add an explicit TickTick authorization entry point to the config page
- add project discovery or selection to reduce manual entry of `HOME_ASSISTANT_ACTION_TASK_PROJECT_ID`
- if the OAuth/token lifecycle turns out to need more robustness, add refresh tokens or persisted auth state then
@@ -0,0 +1,288 @@
{
"apiVersion": "dashboard.grafana.app/v2",
"kind": "Dashboard",
"metadata": {
"name": "adzr6rv",
"namespace": "default",
"uid": "c5fc57e5-7fb5-4104-9861-023710ada568",
"resourceVersion": "1776634346371016",
"generation": 19,
"creationTimestamp": "2026-04-18T19:05:57Z",
"labels": {
"grafana.app/deprecatedInternalID": "945374452785152"
},
"annotations": {
"grafana.app/createdBy": "user:ffjhknvgkvhtsc",
"grafana.app/folder": "",
"grafana.app/saved-from-ui": "Grafana v13.0.1 (a100054f)",
"grafana.app/updatedBy": "user:ffjhknvgkvhtsc",
"grafana.app/updatedTimestamp": "2026-04-19T21:32:26Z"
}
},
"spec": {
"annotations": [
{
"kind": "AnnotationQuery",
"spec": {
"query": {
"kind": "DataQuery",
"group": "grafana",
"version": "v0",
"datasource": {
"name": "-- Grafana --"
},
"spec": {}
},
"enable": true,
"hide": true,
"iconColor": "rgba(0, 211, 255, 1)",
"name": "Annotations & Alerts",
"builtIn": true
}
}
],
"cursorSync": "Off",
"editable": true,
"elements": {
"panel-1": {
"kind": "Panel",
"spec": {
"id": 1,
"title": "轨迹",
"description": "",
"links": [],
"data": {
"kind": "QueryGroup",
"spec": {
"queries": [
{
"kind": "PanelQuery",
"spec": {
"query": {
"kind": "DataQuery",
"group": "frser-sqlite-datasource",
"version": "v0",
"datasource": {
"name": "ffjhr941d5iwwf"
},
"spec": {
"queryText": "SELECT\n datetime AS time,\n latitude,\n longitude,\n altitude\nFROM location\nWHERE person = 'Jiangxue'\n AND datetime >= '2021-04-19T21:29:57.036Z'\n AND datetime <= '2026-04-19T21:29:57.036Z'\n AND latitude != 0\n AND longitude != 0\nORDER BY datetime;\n",
"queryType": "table",
"rawQueryText": "SELECT\n datetime AS time,\n latitude,\n longitude,\n altitude\nFROM location\nWHERE person = '$person'\n AND datetime >= '${__from:date:iso}'\n AND datetime <= '${__to:date:iso}'\n AND latitude != 0\n AND longitude != 0\nORDER BY datetime;\n",
"timeColumns": [
"time",
"ts"
]
}
},
"refId": "A",
"hidden": false
}
}
],
"transformations": [],
"queryOptions": {}
}
},
"vizConfig": {
"kind": "VizConfig",
"group": "geomap",
"version": "13.0.1",
"spec": {
"options": {
"basemap": {
"config": {
"server": "streets"
},
"name": "Layer 0",
"noRepeat": false,
"type": "default"
},
"controls": {
"mouseWheelZoom": true,
"showAttribution": true,
"showDebug": false,
"showMeasure": false,
"showScale": false,
"showZoom": true
},
"layers": [
{
"config": {
"showLegend": false,
"style": {
"color": {
"fixed": "blue"
},
"opacity": 0.7,
"rotation": {
"fixed": 0,
"max": 360,
"min": -360,
"mode": "mod"
},
"size": {
"fixed": 3,
"max": 15,
"min": 2
},
"symbol": {
"fixed": "img/icons/marker/circle.svg",
"mode": "fixed"
},
"symbolAlign": {
"horizontal": "center",
"vertical": "center"
},
"textConfig": {
"fontSize": 12,
"offsetX": 0,
"offsetY": 0,
"textAlign": "center",
"textBaseline": "middle"
}
}
},
"layer-tooltip": true,
"name": "path",
"tooltip": true,
"type": "markers"
}
],
"tooltip": {
"mode": "details"
},
"view": {
"allLayers": true,
"dashboardVariable": false,
"id": "fit",
"lat": 0,
"lon": 0,
"noRepeat": false,
"shared": false,
"zoom": 15
}
},
"fieldConfig": {
"defaults": {
"thresholds": {
"mode": "absolute",
"steps": [
{
"value": 0,
"color": "green"
}
]
},
"color": {
"mode": "thresholds"
},
"custom": {
"hideFrom": {
"legend": false,
"tooltip": false,
"viz": false
}
}
},
"overrides": []
}
}
}
}
}
},
"layout": {
"kind": "GridLayout",
"spec": {
"items": [
{
"kind": "GridLayoutItem",
"spec": {
"x": 0,
"y": 0,
"width": 24,
"height": 18,
"element": {
"kind": "ElementReference",
"name": "panel-1"
}
}
}
]
}
},
"links": [],
"liveNow": false,
"preload": false,
"tags": [],
"timeSettings": {
"timezone": "browser",
"from": "now-5y",
"to": "now",
"autoRefresh": "",
"autoRefreshIntervals": [
"5s",
"10s",
"30s",
"1m",
"5m",
"15m",
"30m",
"1h",
"2h",
"1d"
],
"hideTimepicker": false,
"fiscalYearStartMonth": 0
},
"title": "轨迹",
"variables": [
{
"kind": "QueryVariable",
"spec": {
"name": "person",
"current": {
"text": "Jiangxue",
"value": "Jiangxue"
},
"label": "person",
"hide": "dontHide",
"refresh": "onDashboardLoad",
"skipUrlSync": false,
"description": "",
"query": {
"kind": "DataQuery",
"group": "frser-sqlite-datasource",
"version": "v0",
"datasource": {
"name": "ffjhr941d5iwwf"
},
"spec": {
"__legacyStringValue": "SELECT DISTINCT person\nFROM location\nORDER BY person;\n"
}
},
"regex": "",
"regexApplyTo": "value",
"sort": "disabled",
"definition": "SELECT DISTINCT person\nFROM location\nORDER BY person;\n",
"options": [],
"multi": false,
"includeAll": false,
"allowCustomValue": true
}
}
],
"preferences": {
"layout": {
"kind": "AutoGridLayout",
"spec": {
"maxColumnCount": 3,
"columnWidthMode": "standard",
"rowHeightMode": "standard",
"items": []
}
}
}
}
}
@@ -0,0 +1,231 @@
{
"apiVersion": "dashboard.grafana.app/v2",
"kind": "Dashboard",
"metadata": {
"name": "adl5sjt",
"namespace": "default",
"uid": "d4c72406-9fc5-4b85-844b-be1250f1fa8b",
"resourceVersion": "1776606363367013",
"generation": 6,
"creationTimestamp": "2026-04-18T20:07:34Z",
"labels": {
"grafana.app/deprecatedInternalID": "960882027798528"
},
"annotations": {
"grafana.app/createdBy": "user:ffjhknvgkvhtsc",
"grafana.app/folder": "",
"grafana.app/saved-from-ui": "Grafana v13.0.1 (a100054f)",
"grafana.app/updatedBy": "user:ffjhknvgkvhtsc",
"grafana.app/updatedTimestamp": "2026-04-19T13:46:03Z"
}
},
"spec": {
"annotations": [
{
"kind": "AnnotationQuery",
"spec": {
"query": {
"kind": "DataQuery",
"group": "grafana",
"version": "v0",
"datasource": {
"name": "-- Grafana --"
},
"spec": {}
},
"enable": true,
"hide": true,
"iconColor": "rgba(0, 211, 255, 1)",
"name": "Annotations & Alerts",
"builtIn": true
}
}
],
"cursorSync": "Off",
"editable": true,
"elements": {
"panel-1": {
"kind": "Panel",
"spec": {
"id": 1,
"title": "Mika Poo",
"description": "Mika's poo",
"links": [],
"data": {
"kind": "QueryGroup",
"spec": {
"queries": [
{
"kind": "PanelQuery",
"spec": {
"query": {
"kind": "DataQuery",
"group": "frser-sqlite-datasource",
"version": "v0",
"datasource": {
"name": "ffjhkuu4hc3y8e"
},
"spec": {
"queryText": "SELECT\n latitude,\n longitude,\n timestamp\nFROM poo_records\nWHERE timestamp >= '${__from:date:iso}'\n AND timestamp <= '${__to:date:iso}'\n AND latitude != 0\n AND longitude != 0\nORDER BY timestamp;\n",
"queryType": "table",
"rawQueryText": "SELECT\n latitude,\n longitude,\n timestamp\nFROM poo_records\nWHERE timestamp >= '${__from:date:iso}'\n AND timestamp <= '${__to:date:iso}'\n AND latitude != 0\n AND longitude != 0\nORDER BY timestamp;\n",
"timeColumns": [
"time",
"ts"
]
}
},
"refId": "A",
"hidden": false
}
}
],
"transformations": [],
"queryOptions": {}
}
},
"vizConfig": {
"kind": "VizConfig",
"group": "geomap",
"version": "13.0.1",
"spec": {
"options": {
"basemap": {
"config": {},
"name": "Layer 0",
"noRepeat": false,
"type": "default"
},
"controls": {
"mouseWheelZoom": true,
"showAttribution": true,
"showDebug": false,
"showMeasure": false,
"showScale": false,
"showZoom": true
},
"layers": [
{
"config": {
"blur": 15,
"radius": 5,
"weight": {
"fixed": 1,
"max": 1,
"min": 0
}
},
"filterData": {
"id": "byRefId",
"options": "A"
},
"location": {
"mode": "auto"
},
"name": "Poo",
"tooltip": true,
"type": "heatmap"
}
],
"tooltip": {
"mode": "details"
},
"view": {
"allLayers": true,
"dashboardVariable": false,
"id": "zero",
"lat": 0,
"lon": 0,
"noRepeat": false,
"zoom": 1
}
},
"fieldConfig": {
"defaults": {
"thresholds": {
"mode": "absolute",
"steps": [
{
"value": 0,
"color": "green"
},
{
"value": 80,
"color": "red"
}
]
},
"color": {
"mode": "thresholds"
},
"custom": {
"hideFrom": {
"legend": false,
"tooltip": false,
"viz": false
}
}
},
"overrides": []
}
}
}
}
}
},
"layout": {
"kind": "GridLayout",
"spec": {
"items": [
{
"kind": "GridLayoutItem",
"spec": {
"x": 0,
"y": 0,
"width": 24,
"height": 19,
"element": {
"kind": "ElementReference",
"name": "panel-1"
}
}
}
]
}
},
"links": [],
"liveNow": false,
"preload": false,
"tags": [],
"timeSettings": {
"timezone": "browser",
"from": "now-5y",
"to": "now",
"autoRefresh": "",
"autoRefreshIntervals": [
"5s",
"10s",
"30s",
"1m",
"5m",
"15m",
"30m",
"1h",
"2h",
"1d"
],
"hideTimepicker": false,
"fiscalYearStartMonth": 0
},
"title": "Mika Poo",
"variables": [],
"preferences": {
"layout": {
"kind": "GridLayout",
"spec": {
"items": []
}
}
}
}
}
@@ -0,0 +1,13 @@
apiVersion: 1
providers:
- name: home-automation-dashboards
orgId: 1
folder: ""
type: file
disableDeletion: false
allowUiUpdates: false
updateIntervalSeconds: 30
options:
path: /var/lib/grafana/dashboards
foldersFromFilesStructure: false
@@ -0,0 +1,11 @@
apiVersion: 1
datasources:
- name: locationrecorder
uid: ffjhr941d5iwwf
type: frser-sqlite-datasource
access: proxy
isDefault: false
editable: false
jsonData:
path: /data/home-automation/locationRecorder.db
@@ -0,0 +1,11 @@
apiVersion: 1
datasources:
- name: poorecorder
uid: ffjhkuu4hc3y8e
type: frser-sqlite-datasource
access: proxy
isDefault: false
editable: false
jsonData:
path: /data/home-automation/pooRecorder.db
@@ -1,15 +0,0 @@
[program:home_automation_backend]
command=
directory=
user=
group=
environment=
autostart=true
autorestart=true
startsecs=15
startretries=100
stopwaitsecs=30
redirect_stderr=true
stdout_logfile=/var/log/supervisor/%(program_name)s.log
stdout_logfile_maxbytes=5MB
stdout_logfile_backups=5
@@ -1,100 +0,0 @@
#!/usr/bin/bash
# Argument parsing
if [[ $# -ne 1 ]]; then
echo "Usage: $0 [--install|--uninstall|--help]"
echo " --install Install the automation backend"
echo " --uninstall Uninstall the automation backend"
echo " --update Update the installation"
echo " --help Show this help message"
exit 0
fi
key="$1"
case $key in
--install)
INSTALL=true
;;
--uninstall)
UNINSTALL=true
;;
--update)
UPDATE=true
;;
--help)
echo "Usage: $0 [--install|--uninstall|--update|--help]"
echo " --install Install the automation backend"
echo " --uninstall Uninstall the automation backend"
echo " --update Update the installation"
echo " --help Show this help message"
exit 0
;;
*)
echo "Invalid argument: $key"
exit 1
;;
esac
TARGET_DIR="$HOME/.local/home-automation-backend"
SUPERVISOR_CFG_NAME="home_automation_backend"
APP_NAME="home-automation-backend"
SUPERVISOR_CFG="$SUPERVISOR_CFG_NAME.conf"
BASEDIR=$(dirname "$(realpath "$0")")
# Install or uninstall based on arguments
install_backend() {
# Installation code here
echo "Installing..."
sudo supervisorctl stop $SUPERVISOR_CFG_NAME
mkdir -p $TARGET_DIR
cd $BASEDIR"/../src/" && go build -o $TARGET_DIR/$APP_NAME
cp $BASEDIR/"$SUPERVISOR_CFG_NAME"_template.conf $BASEDIR/$SUPERVISOR_CFG
sed -i "s+command=+command=$TARGET_DIR/$APP_NAME serve+g" $BASEDIR/$SUPERVISOR_CFG
sed -i "s+directory=+directory=$TARGET_DIR+g" $BASEDIR/$SUPERVISOR_CFG
sed -i "s+user=+user=$USER+g" $BASEDIR/$SUPERVISOR_CFG
sed -i "s+group=+group=$USER+g" $BASEDIR/$SUPERVISOR_CFG
sed -i "s+environment=+environment=HOME=\"$HOME\"+g" $BASEDIR/$SUPERVISOR_CFG
sudo mv $BASEDIR/$SUPERVISOR_CFG /etc/supervisor/conf.d/$SUPERVISOR_CFG
sudo supervisorctl reread
sudo supervisorctl update
sudo supervisorctl start $SUPERVISOR_CFG_NAME
echo "Installation complete."
}
uninstall_backend() {
# Uninstallation code here
echo "Uninstalling..."
sudo supervisorctl stop $SUPERVISOR_CFG_NAME
sudo supervisorctl remove $SUPERVISOR_CFG_NAME
sudo rm /etc/supervisor/conf.d/$SUPERVISOR_CFG
rm -rf $TARGET_DIR/
echo "Uninstallation complete."
echo "Config files and db is stored in $HOME/.config/home-automation"
}
update_backend() {
uninstall_backend
install_backend
}
if [[ $INSTALL ]]; then
install_backend
elif [[ $UNINSTALL ]]; then
uninstall_backend
elif [[ $UPDATE ]]; then
update_backend
else
echo "Invalid argument: $key"
exit 1
fi
@@ -0,0 +1,494 @@
{
"openapi": "3.1.0",
"info": {
"title": "Home Automation Backend (Python)",
"description": "Home automation backend with auth, runtime config, Home Assistant integrations, TickTick integration, and SQLite-backed recorders.",
"version": "0.1.0"
},
"paths": {
"/status": {
"get": {
"tags": [
"system"
],
"summary": "Get Status",
"operationId": "get_status_status_get",
"responses": {
"200": {
"description": "Successful Response",
"content": {
"application/json": {
"schema": {
"$ref": "#/components/schemas/StatusResponse"
}
}
}
}
}
}
},
"/login": {
"get": {
"tags": [
"auth"
],
"summary": "Login Page",
"operationId": "login_page_login_get",
"responses": {
"200": {
"description": "Successful Response",
"content": {
"text/html": {
"schema": {
"type": "string"
}
}
}
}
}
},
"post": {
"tags": [
"auth"
],
"summary": "Login Submit",
"operationId": "login_submit_login_post",
"requestBody": {
"content": {
"application/x-www-form-urlencoded": {
"schema": {
"$ref": "#/components/schemas/Body_login_submit_login_post"
}
}
},
"required": true
},
"responses": {
"200": {
"description": "Successful Response",
"content": {
"text/html": {
"schema": {
"type": "string"
}
}
}
},
"422": {
"description": "Validation Error",
"content": {
"application/json": {
"schema": {
"$ref": "#/components/schemas/HTTPValidationError"
}
}
}
}
}
}
},
"/config/change-password": {
"post": {
"tags": [
"auth"
],
"summary": "Change Password Submit",
"operationId": "change_password_submit_config_change_password_post",
"requestBody": {
"content": {
"application/x-www-form-urlencoded": {
"schema": {
"$ref": "#/components/schemas/Body_change_password_submit_config_change_password_post"
}
}
},
"required": true
},
"responses": {
"200": {
"description": "Successful Response",
"content": {
"text/html": {
"schema": {
"type": "string"
}
}
}
},
"422": {
"description": "Validation Error",
"content": {
"application/json": {
"schema": {
"$ref": "#/components/schemas/HTTPValidationError"
}
}
}
}
}
}
},
"/logout": {
"post": {
"tags": [
"auth"
],
"summary": "Logout",
"operationId": "logout_logout_post",
"requestBody": {
"content": {
"application/x-www-form-urlencoded": {
"schema": {
"$ref": "#/components/schemas/Body_logout_logout_post"
}
}
},
"required": true
},
"responses": {
"200": {
"description": "Successful Response",
"content": {
"application/json": {
"schema": {}
}
}
},
"422": {
"description": "Validation Error",
"content": {
"application/json": {
"schema": {
"$ref": "#/components/schemas/HTTPValidationError"
}
}
}
}
}
}
},
"/": {
"get": {
"tags": [
"pages"
],
"summary": "Home",
"operationId": "home__get",
"responses": {
"200": {
"description": "Successful Response",
"content": {
"text/html": {
"schema": {
"type": "string"
}
}
}
}
}
}
},
"/admin": {
"get": {
"tags": [
"pages"
],
"summary": "Admin Redirect",
"operationId": "admin_redirect_admin_get",
"responses": {
"200": {
"description": "Successful Response",
"content": {
"text/html": {
"schema": {
"type": "string"
}
}
}
}
}
}
},
"/config": {
"get": {
"tags": [
"pages"
],
"summary": "Config Page",
"operationId": "config_page_config_get",
"responses": {
"200": {
"description": "Successful Response",
"content": {
"text/html": {
"schema": {
"type": "string"
}
}
}
}
}
},
"post": {
"tags": [
"pages"
],
"summary": "Config Submit",
"operationId": "config_submit_config_post",
"responses": {
"200": {
"description": "Successful Response",
"content": {
"text/html": {
"schema": {
"type": "string"
}
}
}
}
}
}
},
"/homeassistant/publish": {
"post": {
"tags": [
"homeassistant"
],
"summary": "Publish From Homeassistant",
"operationId": "publish_from_homeassistant_homeassistant_publish_post",
"responses": {
"200": {
"description": "Successful Response",
"content": {
"application/json": {
"schema": {}
}
}
}
}
}
},
"/location/record": {
"post": {
"tags": [
"location"
],
"summary": "Create Location Record",
"operationId": "create_location_record_location_record_post",
"responses": {
"200": {
"description": "Successful Response",
"content": {
"application/json": {
"schema": {}
}
}
}
}
}
},
"/poo/record": {
"post": {
"tags": [
"poo"
],
"summary": "Create Poo Record",
"operationId": "create_poo_record_poo_record_post",
"responses": {
"200": {
"description": "Successful Response",
"content": {
"application/json": {
"schema": {}
}
}
}
}
}
},
"/poo/latest": {
"get": {
"tags": [
"poo"
],
"summary": "Notify Latest Poo",
"operationId": "notify_latest_poo_poo_latest_get",
"responses": {
"200": {
"description": "Successful Response",
"content": {
"application/json": {
"schema": {}
}
}
}
}
}
},
"/ticktick/auth/start": {
"get": {
"tags": [
"ticktick"
],
"summary": "Start Ticktick Auth",
"operationId": "start_ticktick_auth_ticktick_auth_start_get",
"responses": {
"200": {
"description": "Successful Response",
"content": {
"application/json": {
"schema": {}
}
}
}
}
}
},
"/ticktick/auth/code": {
"get": {
"tags": [
"ticktick"
],
"summary": "Handle Ticktick Auth Code",
"operationId": "handle_ticktick_auth_code_ticktick_auth_code_get",
"responses": {
"200": {
"description": "Successful Response",
"content": {
"application/json": {
"schema": {}
}
}
}
}
}
}
},
"components": {
"schemas": {
"Body_change_password_submit_config_change_password_post": {
"properties": {
"current_password": {
"type": "string",
"title": "Current Password"
},
"new_password": {
"type": "string",
"title": "New Password"
},
"confirm_password": {
"type": "string",
"title": "Confirm Password"
},
"csrf_token": {
"type": "string",
"title": "Csrf Token"
}
},
"type": "object",
"required": [
"current_password",
"new_password",
"confirm_password",
"csrf_token"
],
"title": "Body_change_password_submit_config_change_password_post"
},
"Body_login_submit_login_post": {
"properties": {
"username": {
"type": "string",
"title": "Username"
},
"password": {
"type": "string",
"title": "Password"
},
"csrf_token": {
"type": "string",
"title": "Csrf Token"
}
},
"type": "object",
"required": [
"username",
"password",
"csrf_token"
],
"title": "Body_login_submit_login_post"
},
"Body_logout_logout_post": {
"properties": {
"csrf_token": {
"type": "string",
"title": "Csrf Token"
}
},
"type": "object",
"required": [
"csrf_token"
],
"title": "Body_logout_logout_post"
},
"HTTPValidationError": {
"properties": {
"detail": {
"items": {
"$ref": "#/components/schemas/ValidationError"
},
"type": "array",
"title": "Detail"
}
},
"type": "object",
"title": "HTTPValidationError"
},
"StatusResponse": {
"properties": {
"status": {
"type": "string",
"title": "Status"
}
},
"type": "object",
"required": [
"status"
],
"title": "StatusResponse"
},
"ValidationError": {
"properties": {
"loc": {
"items": {
"anyOf": [
{
"type": "string"
},
{
"type": "integer"
}
]
},
"type": "array",
"title": "Location"
},
"msg": {
"type": "string",
"title": "Message"
},
"type": {
"type": "string",
"title": "Error Type"
}
},
"type": "object",
"required": [
"loc",
"msg",
"type"
],
"title": "ValidationError"
}
}
}
}
@@ -0,0 +1,317 @@
openapi: 3.1.0
info:
  title: Home Automation Backend (Python)
  description: Home automation backend with auth, runtime config, Home Assistant
    integrations, TickTick integration, and SQLite-backed recorders.
  version: 0.1.0
paths:
  /status:
    get:
      tags:
        - system
      summary: Get Status
      operationId: get_status_status_get
      responses:
        '200':
          description: Successful Response
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/StatusResponse'
  /login:
    get:
      tags:
        - auth
      summary: Login Page
      operationId: login_page_login_get
      responses:
        '200':
          description: Successful Response
          content:
            text/html:
              schema:
                type: string
    post:
      tags:
        - auth
      summary: Login Submit
      operationId: login_submit_login_post
      requestBody:
        content:
          application/x-www-form-urlencoded:
            schema:
              $ref: '#/components/schemas/Body_login_submit_login_post'
        required: true
      responses:
        '200':
          description: Successful Response
          content:
            text/html:
              schema:
                type: string
        '422':
          description: Validation Error
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/HTTPValidationError'
  /config/change-password:
    post:
      tags:
        - auth
      summary: Change Password Submit
      operationId: change_password_submit_config_change_password_post
      requestBody:
        content:
          application/x-www-form-urlencoded:
            schema:
              $ref: '#/components/schemas/Body_change_password_submit_config_change_password_post'
        required: true
      responses:
        '200':
          description: Successful Response
          content:
            text/html:
              schema:
                type: string
        '422':
          description: Validation Error
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/HTTPValidationError'
  /logout:
    post:
      tags:
        - auth
      summary: Logout
      operationId: logout_logout_post
      requestBody:
        content:
          application/x-www-form-urlencoded:
            schema:
              $ref: '#/components/schemas/Body_logout_logout_post'
        required: true
      responses:
        '200':
          description: Successful Response
          content:
            application/json:
              schema: {}
        '422':
          description: Validation Error
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/HTTPValidationError'
  /:
    get:
      tags:
        - pages
      summary: Home
      operationId: home__get
      responses:
        '200':
          description: Successful Response
          content:
            text/html:
              schema:
                type: string
  /admin:
    get:
      tags:
        - pages
      summary: Admin Redirect
      operationId: admin_redirect_admin_get
      responses:
        '200':
          description: Successful Response
          content:
            text/html:
              schema:
                type: string
  /config:
    get:
      tags:
        - pages
      summary: Config Page
      operationId: config_page_config_get
      responses:
        '200':
          description: Successful Response
          content:
            text/html:
              schema:
                type: string
    post:
      tags:
        - pages
      summary: Config Submit
      operationId: config_submit_config_post
      responses:
        '200':
          description: Successful Response
          content:
            text/html:
              schema:
                type: string
  /homeassistant/publish:
    post:
      tags:
        - homeassistant
      summary: Publish From Homeassistant
      operationId: publish_from_homeassistant_homeassistant_publish_post
      responses:
        '200':
          description: Successful Response
          content:
            application/json:
              schema: {}
  /location/record:
    post:
      tags:
        - location
      summary: Create Location Record
      operationId: create_location_record_location_record_post
      responses:
        '200':
          description: Successful Response
          content:
            application/json:
              schema: {}
  /poo/record:
    post:
      tags:
        - poo
      summary: Create Poo Record
      operationId: create_poo_record_poo_record_post
      responses:
        '200':
          description: Successful Response
          content:
            application/json:
              schema: {}
  /poo/latest:
    get:
      tags:
        - poo
      summary: Notify Latest Poo
      operationId: notify_latest_poo_poo_latest_get
      responses:
        '200':
          description: Successful Response
          content:
            application/json:
              schema: {}
  /ticktick/auth/start:
    get:
      tags:
        - ticktick
      summary: Start Ticktick Auth
      operationId: start_ticktick_auth_ticktick_auth_start_get
      responses:
        '200':
          description: Successful Response
          content:
            application/json:
              schema: {}
  /ticktick/auth/code:
    get:
      tags:
        - ticktick
      summary: Handle Ticktick Auth Code
      operationId: handle_ticktick_auth_code_ticktick_auth_code_get
      responses:
        '200':
          description: Successful Response
          content:
            application/json:
              schema: {}
components:
  schemas:
    Body_change_password_submit_config_change_password_post:
      properties:
        current_password:
          type: string
          title: Current Password
        new_password:
          type: string
          title: New Password
        confirm_password:
          type: string
          title: Confirm Password
        csrf_token:
          type: string
          title: Csrf Token
      type: object
      required:
        - current_password
        - new_password
        - confirm_password
        - csrf_token
      title: Body_change_password_submit_config_change_password_post
    Body_login_submit_login_post:
      properties:
        username:
          type: string
          title: Username
        password:
          type: string
          title: Password
        csrf_token:
          type: string
          title: Csrf Token
      type: object
      required:
        - username
        - password
        - csrf_token
      title: Body_login_submit_login_post
    Body_logout_logout_post:
      properties:
        csrf_token:
          type: string
          title: Csrf Token
      type: object
      required:
        - csrf_token
      title: Body_logout_logout_post
    HTTPValidationError:
      properties:
        detail:
          items:
            $ref: '#/components/schemas/ValidationError'
          type: array
          title: Detail
      type: object
      title: HTTPValidationError
    StatusResponse:
      properties:
        status:
          type: string
          title: Status
      type: object
      required:
        - status
      title: StatusResponse
    ValidationError:
      properties:
        loc:
          items:
            anyOf:
              - type: string
              - type: integer
          type: array
          title: Location
        msg:
          type: string
          title: Message
        type:
          type: string
          title: Error Type
      type: object
      required:
        - loc
        - msg
        - type
      title: ValidationError
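Because `pyyaml` is among the pinned dependencies, the YAML spec above can be consumed programmatically. A small sketch that parses a hypothetical inline excerpt of the spec and enumerates its routes (the excerpt is a hand-trimmed copy for illustration, not the full file):

```python
import yaml

# Hypothetical inline excerpt of the spec above; in practice you would
# yaml.safe_load(open("openapi.yaml")) against the generated file.
spec = yaml.safe_load("""
openapi: 3.1.0
paths:
  /status:
    get: {}
  /poo/latest:
    get: {}
""")

print(sorted(spec["paths"]))  # -> ['/poo/latest', '/status']
```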
@@ -0,0 +1,28 @@
[build-system]
requires = ["setuptools>=68", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "home-automation-python"
version = "0.1.0"
description = "Home automation backend with auth, integrations, and SQLite-backed services."
readme = "README.md"
requires-python = ">=3.11"

[tool.setuptools]
packages = [
    "app",
    "app.api",
    "app.api.routes",
    "app.integrations",
    "app.models",
    "app.schemas",
    "app.services",
]

[tool.pytest.ini_options]
testpaths = ["tests"]
pythonpath = ["."]

[tool.ruff]
line-length = 100
@@ -0,0 +1,11 @@
alembic>=1.14,<2.0
apscheduler>=3.10,<4.0
argon2-cffi>=25.1,<26.0
fastapi>=0.115,<0.116
httpx>=0.28,<1.0
jinja2>=3.1,<4.0
pydantic-settings>=2.6,<3.0
python-multipart>=0.0.12,<1.0
pyyaml>=6.0,<7.0
sqlalchemy>=2.0,<3.0
uvicorn[standard]>=0.32,<1.0
@@ -0,0 +1,103 @@
#
# This file is autogenerated by pip-compile with Python 3.13
# by the following command:
#
#    pip-compile requirements.in
#
alembic==1.18.4
    # via -r requirements.in
annotated-types==0.7.0
    # via pydantic
anyio==4.13.0
    # via
    #   httpx
    #   starlette
    #   watchfiles
apscheduler==3.11.2
    # via -r requirements.in
argon2-cffi==25.1.0
    # via -r requirements.in
argon2-cffi-bindings==25.1.0
    # via argon2-cffi
certifi==2026.4.22
    # via
    #   httpcore
    #   httpx
cffi==2.0.0
    # via argon2-cffi-bindings
click==8.3.2
    # via uvicorn
fastapi==0.115.14
    # via -r requirements.in
greenlet==3.4.0
    # via sqlalchemy
h11==0.16.0
    # via
    #   httpcore
    #   uvicorn
httpcore==1.0.9
    # via httpx
httptools==0.7.1
    # via uvicorn
httpx==0.28.1
    # via -r requirements.in
idna==3.11
    # via
    #   anyio
    #   httpx
jinja2==3.1.6
    # via -r requirements.in
mako==1.3.11
    # via alembic
markupsafe==3.0.3
    # via
    #   jinja2
    #   mako
pycparser==2.23
    # via cffi
pydantic==2.13.2
    # via
    #   fastapi
    #   pydantic-settings
pydantic-core==2.46.2
    # via pydantic
pydantic-settings==2.13.1
    # via -r requirements.in
python-dotenv==1.2.2
    # via
    #   pydantic-settings
    #   uvicorn
python-multipart==0.0.26
    # via -r requirements.in
pyyaml==6.0.3
    # via
    #   -r requirements.in
    #   uvicorn
sqlalchemy==2.0.49
    # via
    #   -r requirements.in
    #   alembic
starlette==0.46.2
    # via fastapi
typing-extensions==4.15.0
    # via
    #   alembic
    #   fastapi
    #   pydantic
    #   pydantic-core
    #   sqlalchemy
    #   typing-inspection
typing-inspection==0.4.2
    # via
    #   pydantic
    #   pydantic-settings
tzlocal==5.3.1
    # via apscheduler
uvicorn[standard]==0.44.0
    # via -r requirements.in
uvloop==0.22.1
    # via uvicorn
watchfiles==1.1.1
    # via uvicorn
websockets==16.0
    # via uvicorn
@@ -0,0 +1,1 @@
"""Project helper scripts."""
@@ -0,0 +1,162 @@
from __future__ import annotations

import sqlite3
import sys
from pathlib import Path

from alembic import command
from alembic.config import Config
from alembic.script import ScriptDirectory
from alembic.util.exc import CommandError

PROJECT_ROOT = Path(__file__).resolve().parents[1]
if str(PROJECT_ROOT) not in sys.path:
    sys.path.insert(0, str(PROJECT_ROOT))

from app.config import get_settings

APP_BASELINE_REVISION = "20260429_05_public_ip_monitor"


class AppDatabaseAdoptionError(RuntimeError):
    """Raised when the app database is missing or not managed as expected."""


def _database_path_from_url(database_url: str) -> Path:
    prefix = "sqlite:///"
    if not database_url.startswith(prefix):
        raise AppDatabaseAdoptionError(
            f"Only sqlite URLs are supported for app DB initialization, got: {database_url}"
        )
    return Path(database_url[len(prefix) :])


def _make_alembic_config(database_url: str) -> Config:
    config = Config("alembic_app.ini")
    config.set_main_option("sqlalchemy.url", database_url)
    return config


def _expected_head_revision(alembic_config: Config) -> str:
    script = ScriptDirectory.from_config(alembic_config)
    heads = script.get_heads()
    if len(heads) != 1:
        raise AppDatabaseAdoptionError(
            f"Expected exactly one Alembic head for app DB, got {len(heads)}"
        )
    return heads[0]


def _is_known_revision(alembic_config: Config, revision: str) -> bool:
    script = ScriptDirectory.from_config(alembic_config)
    try:
        return script.get_revision(revision) is not None
    except CommandError:
        return False


def _alembic_version_table_exists(database_path: Path) -> bool:
    conn = sqlite3.connect(database_path)
    try:
        row = conn.execute(
            "SELECT 1 FROM sqlite_master WHERE type = 'table' AND name = 'alembic_version'"
        ).fetchone()
        return row is not None
    finally:
        conn.close()


def _fetch_alembic_revision(database_path: Path) -> str:
    conn = sqlite3.connect(database_path)
    try:
        row = conn.execute("SELECT version_num FROM alembic_version").fetchone()
        if row is None:
            raise AppDatabaseAdoptionError("Alembic version table exists but contains no revision")
        return row[0]
    finally:
        conn.close()


def _list_user_tables(database_path: Path) -> list[str]:
    conn = sqlite3.connect(database_path)
    try:
        rows = conn.execute(
            """
            SELECT name
            FROM sqlite_master
            WHERE type = 'table'
              AND name NOT LIKE 'sqlite_%'
            """
        ).fetchall()
        return sorted(row[0] for row in rows)
    finally:
        conn.close()


def validate_app_runtime_db(database_url: str) -> None:
    database_path = _database_path_from_url(database_url)
    alembic_config = _make_alembic_config(database_url)
    expected_revision = _expected_head_revision(alembic_config)
    if not database_path.exists():
        raise AppDatabaseAdoptionError(
            "App DB file was not found. Run 'python scripts/app_db_adopt.py' first to "
            "initialize the app DB before starting the app."
        )
    if not _alembic_version_table_exists(database_path):
        raise AppDatabaseAdoptionError(
            "App DB exists but is not yet Alembic-managed. Run "
            "'python scripts/app_db_adopt.py' first before starting the app."
        )
    current_revision = _fetch_alembic_revision(database_path)
    if current_revision != expected_revision:
        raise AppDatabaseAdoptionError(
            "App DB revision mismatch. Refusing to start the app: "
            f"expected {expected_revision}, got {current_revision}"
        )


def adopt_or_initialize_app_db(database_url: str) -> str:
    database_path = _database_path_from_url(database_url)
    alembic_config = _make_alembic_config(database_url)
    expected_revision = _expected_head_revision(alembic_config)
    if database_path.exists():
        if _alembic_version_table_exists(database_path):
            current_revision = _fetch_alembic_revision(database_path)
            if current_revision == expected_revision:
                return "already_managed"
            if not _is_known_revision(alembic_config, current_revision):
                raise AppDatabaseAdoptionError(
                    "App DB is already Alembic-managed but revision does not match "
                    f"a known migration revision: got {current_revision}"
                )
            command.upgrade(alembic_config, "head")
            return "upgraded"
        existing_tables = _list_user_tables(database_path)
        if existing_tables:
            raise AppDatabaseAdoptionError(
                "App DB exists with unmanaged tables. Refusing to continue because there is "
                "no legacy app DB adoption path in this revision."
            )
    database_path.parent.mkdir(parents=True, exist_ok=True)
    command.upgrade(alembic_config, "head")
    return "initialized"


def main() -> None:
    settings = get_settings()
    result = adopt_or_initialize_app_db(settings.app_database_url)
    if result == "initialized":
        print("Initialized a new app DB via Alembic upgrade head.")
    elif result == "upgraded":
        print("Upgraded existing app DB to the expected Alembic head revision.")
    else:
        print("App DB is already Alembic-managed at the expected baseline revision.")


if __name__ == "__main__":
    main()
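The adoption logic above hinges on two cheap sqlite probes before any Alembic command runs: whether an `alembic_version` table exists, and which revision it records. A self-contained sketch of the first probe against a throwaway database (the temp path is illustrative; the revision string is the script's `APP_BASELINE_REVISION`):

```python
import sqlite3
import tempfile
from pathlib import Path

# Throwaway database standing in for the app DB.
db_path = Path(tempfile.mkdtemp()) / "app.db"
conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE alembic_version (version_num VARCHAR(32) NOT NULL)")
conn.execute("INSERT INTO alembic_version VALUES ('20260429_05_public_ip_monitor')")
conn.commit()
conn.close()

def alembic_version_table_exists(path: Path) -> bool:
    # Same sqlite_master probe the adoption script uses.
    conn = sqlite3.connect(path)
    try:
        row = conn.execute(
            "SELECT 1 FROM sqlite_master WHERE type = 'table' AND name = 'alembic_version'"
        ).fetchone()
        return row is not None
    finally:
        conn.close()

print(alembic_version_table_exists(db_path))  # -> True
```

When the table exists, the script then compares `version_num` against the single Alembic head and either returns `already_managed`, upgrades, or refuses with `AppDatabaseAdoptionError`.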

Some files were not shown because too many files have changed in this diff.