8 Commits

Author SHA1 Message Date
tliu93 a24e402d47 add grafana provisioning
pytest / test (push) Successful in 46s
2026-04-23 00:12:51 +02:00
tliu93 8565534b73 Merge pull request 'fix ci test' (#5) from feature/add_separate_migration_container into main
pytest / test (push) Successful in 45s
docker-image / build-and-push (push) Successful in 3m40s
Reviewed-on: #5
2026-04-22 13:35:40 +02:00
tliu93 4acdd2dc60 fix ci test
pytest / test (push) Successful in 45s
pytest / test (pull_request) Successful in 44s
2026-04-22 13:31:26 +02:00
tliu93 c9af7530e5 Merge pull request 'change adoption to separate step' (#4) from feature/add_separate_migration_container into main
pytest / test (push) Failing after 44s
docker-image / build-and-push (push) Successful in 3m40s
Reviewed-on: #4
2026-04-22 13:28:30 +02:00
tliu93 a76d6bfb71 change adoption to separate step
pytest / test (push) Failing after 46s
pytest / test (pull_request) Failing after 45s
2026-04-22 13:28:00 +02:00
tliu93 35aee79d93 Restore legacy poo inbound dispatch
pytest / test (push) Successful in 43s
docker-image / build-and-push (push) Successful in 3m38s
2026-04-20 23:33:57 +02:00
tliu93 b9e7f51d51 Split compose dev build from registry deploy
pytest / test (push) Successful in 44s
2026-04-20 23:16:13 +02:00
tliu93 94747c75dd Align image publishing with repository path
pytest / test (push) Successful in 43s
docker-image / build-and-push (push) Successful in 3m37s
2026-04-20 23:05:27 +02:00
21 changed files with 1235 additions and 35 deletions
+7 -3
@@ -5,6 +5,10 @@ on:
tags:
- "v*"
env:
REGISTRY_HOST: code.wanderingbadger.dev
IMAGE_NAME: ${{ github.repository }}
jobs:
build-and-push:
runs-on: ubuntu-latest
@@ -24,7 +28,7 @@ jobs:
- name: Log in to Gitea Container Registry
uses: docker/login-action@v3
with:
registry: code.wanderingbadger.dev
registry: ${{ env.REGISTRY_HOST }}
username: ${{ secrets.REGISTRY_USERNAME }}
password: ${{ secrets.REGISTRY_TOKEN }}
@@ -35,5 +39,5 @@ jobs:
platforms: linux/amd64,linux/arm64
push: true
tags: |
code.wanderingbadger.dev/tliu93/home-automation:${{ github.ref_name }}
code.wanderingbadger.dev/tliu93/home-automation:latest
${{ env.REGISTRY_HOST }}/${{ env.IMAGE_NAME }}:${{ github.ref_name }}
${{ env.REGISTRY_HOST }}/${{ env.IMAGE_NAME }}:latest
+1
@@ -23,3 +23,4 @@ RUN mkdir -p /app/data
EXPOSE 8000
ENTRYPOINT ["/app/docker/entrypoint.sh"]
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
+77 -5
@@ -107,9 +107,7 @@ cp .env.example .env
3. Initialize the databases
```bash
python scripts/app_db_adopt.py
python scripts/location_db_adopt.py
python scripts/poo_db_adopt.py
python -m scripts.run_migrations
```
4. Start the service
@@ -141,6 +139,7 @@ uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
- App Alembic environment: `alembic_app.ini` + `alembic_app/`
- Location Alembic environment: `alembic_location.ini` + `alembic_location/`
- Poo Alembic environment: `alembic_poo.ini` + `alembic_poo/`
- Unified migration job: `python -m scripts.run_migrations` (a per-environment sketch follows this list)
- App DB initialization: `python scripts/app_db_adopt.py`
- Location DB adoption / initialization: `python scripts/location_db_adopt.py`
- Poo DB adoption / initialization: `python scripts/poo_db_adopt.py`
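For reference, a single environment can also be driven directly through the Alembic Python API, the same way the adopt scripts do. A minimal sketch; the database URL here is an assumption for illustration (the real scripts read it from app settings):
```python
from alembic import command
from alembic.config import Config

# Point the config at one per-database Alembic environment.
config = Config("alembic_app.ini")
# Assumed URL for illustration only; the adopt scripts take it from settings.
config.set_main_option("sqlalchemy.url", "sqlite:///data/app.db")

# Upgrade that one database to its migration head.
command.upgrade(config, "head")
```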
@@ -217,18 +216,81 @@ python scripts/export_openapi.py
The default Compose service name is `app`, and the container name is fixed to `home-automation-app`.
Startup
Compose is currently split into two layers:
- `docker-compose.yml`: uses the registry image by default, suited to deployment / production pulls
- `docker-compose.override.yml`: adds `build: .` for local development only
Local development startup:
```bash
docker compose up -d --build
```
The command above automatically layers `docker-compose.override.yml` on top, so a local run still rebuilds from the current working directory.
To pull from the registry and start the production way, explicitly use only the base compose file:
```bash
docker compose -f docker-compose.yml pull
docker compose -f docker-compose.yml up -d
```
Follow the logs:
```bash
docker compose logs -f app
```
## Grafana Provisioning
The repository supports loading the SQLite datasources and the dashboard exports checked into the repo automatically via Grafana provisioning.
The following file paths need to be kept:
- `grafana/provisioning/datasources/locationrecorder.yaml`
- `grafana/provisioning/datasources/poorecorder.yaml`
- `grafana/provisioning/dashboards/provider.yaml`
- `grafana/dashboards/locationrecorder.json`
- `grafana/dashboards/poorecorder.json`
These files' responsibilities are:
- `grafana/provisioning/datasources/locationrecorder.yaml`: declares the `locationrecorder` SQLite datasource and points it at `/data/home-automation/locationRecorder.db`
- `grafana/provisioning/datasources/poorecorder.yaml`: declares the `poorecorder` SQLite datasource and points it at `/data/home-automation/pooRecorder.db`
- `grafana/provisioning/dashboards/provider.yaml`: tells Grafana to scan `/var/lib/grafana/dashboards` and load the dashboard JSON files
- `grafana/dashboards/locationrecorder.json`: the location recorder dashboard export; its contents do not need to be rewritten in compose
- `grafana/dashboards/poorecorder.json`: the poo recorder dashboard export; its contents do not need to be rewritten in compose
In the current `docker-compose.yml`, the Grafana service mounts the following directories:
- `./grafana/provisioning -> /etc/grafana/provisioning:ro`
- `./grafana/dashboards -> /var/lib/grafana/dashboards:ro`
The existing named volume `homeautomation_grafana_storage:/var/lib/grafana` is kept as Grafana's runtime data store.
Before the one-command startup, at least the following files must already exist (a small preflight check is sketched after this list):
- `grafana/provisioning/datasources/locationrecorder.yaml`
- `grafana/provisioning/datasources/poorecorder.yaml`
- `grafana/provisioning/dashboards/provider.yaml`
- `grafana/dashboards/locationrecorder.json`
- `grafana/dashboards/poorecorder.json`
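The existence check can be scripted. A minimal sketch, assuming it runs from the repo root; the path list mirrors the one above:
```python
from pathlib import Path

# Provisioning files Grafana expects, per the list above.
REQUIRED = [
    "grafana/provisioning/datasources/locationrecorder.yaml",
    "grafana/provisioning/datasources/poorecorder.yaml",
    "grafana/provisioning/dashboards/provider.yaml",
    "grafana/dashboards/locationrecorder.json",
    "grafana/dashboards/poorecorder.json",
]

missing = [p for p in REQUIRED if not Path(p).is_file()]
if missing:
    raise SystemExit(f"missing Grafana provisioning files: {missing}")
print("all Grafana provisioning files present")
```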
Startup:
```bash
docker compose up -d
```
What happens after startup:
- The Grafana container installs the `frser-sqlite-datasource` plugin
- Grafana reads the datasource YAML files under `/etc/grafana/provisioning/datasources/`
- Grafana reads `/etc/grafana/provisioning/dashboards/provider.yaml`
- Grafana auto-imports the two dashboard JSON files from `/var/lib/grafana/dashboards/`
- The existing Grafana named volume keeps holding Grafana runtime data and does not override the dashboards or provisioning files in the repo
## Container Image CI
The project provides a release image workflow:
@@ -236,7 +298,17 @@ docker compose logs -f app
- workflow file: `.github/workflows/docker-image.yml`
- trigger: pushing a tag that matches `v*`, e.g. `v1.0.0`
- registry: `code.wanderingbadger.dev`
- image: `code.wanderingbadger.dev/tliu93/home-automation`
- image: `code.wanderingbadger.dev/<owner>/<repo>`
The app image that `docker-compose.yml` uses by default in production is currently:
- `code.wanderingbadger.dev/tliu93/home-automation:latest`
The workflow no longer hardcodes the image name to a specific user package path; instead it derives the image path from the current repository identifier:
- `code.wanderingbadger.dev/${github.repository}:${tag}`
In Gitea, a package follows the semantics of repo ownership, expressed mainly in the image naming path itself rather than through an extra "binding" step. In other words, the current release process aligns repo/package semantics by convention via the repository path.
This workflow builds and pushes a multi-arch image.
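To make the env-driven naming concrete, here is a hypothetical helper (not part of the repo) that reproduces the tag expansion shown in the workflow diff above:
```python
def image_tags(repository: str, ref_name: str,
               registry: str = "code.wanderingbadger.dev") -> list[str]:
    """Mirror the workflow's tags: one per release tag, plus latest."""
    base = f"{registry}/{repository}"
    return [f"{base}:{ref_name}", f"{base}:latest"]

# A push of tag v1.0.0 on tliu93/home-automation would yield:
# code.wanderingbadger.dev/tliu93/home-automation:v1.0.0
# code.wanderingbadger.dev/tliu93/home-automation:latest
print(image_tags("tliu93/home-automation", "v1.0.0"))
```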
+32 -4
@@ -6,7 +6,19 @@ from fastapi.responses import PlainTextResponse, Response
from pydantic import ValidationError
from sqlalchemy.orm import Session
from app.dependencies import get_db, get_ticktick_client
from app.config import Settings
from app.dependencies import (
get_app_settings,
get_db,
get_homeassistant_client,
get_poo_db,
get_ticktick_client,
)
from app.integrations.homeassistant import (
HomeAssistantClient,
HomeAssistantConfigError,
HomeAssistantRequestError,
)
from app.integrations.ticktick import TickTickClient, TickTickConfigError, TickTickRequestError
from app.schemas.homeassistant import HomeAssistantPublishEnvelope
from app.services.homeassistant_inbound import (
@@ -24,13 +36,23 @@ INTERNAL_SERVER_ERROR_MESSAGE = "internal server error"
async def publish_from_homeassistant(
request: Request,
db: Session = Depends(get_db),
poo_db: Session = Depends(get_poo_db),
settings: Settings = Depends(get_app_settings),
homeassistant_client: HomeAssistantClient = Depends(get_homeassistant_client),
ticktick_client: TickTickClient = Depends(get_ticktick_client),
) -> Response:
try:
raw_payload = await request.body()
data = json.loads(raw_payload)
envelope = HomeAssistantPublishEnvelope.model_validate(data)
handle_homeassistant_message(db, envelope, ticktick_client)
handle_homeassistant_message(
db,
envelope,
ticktick_client=ticktick_client,
poo_session=poo_db,
settings=settings,
homeassistant_client=homeassistant_client,
)
except json.JSONDecodeError as exc:
logger.warning("Rejected Home Assistant publish request due to invalid JSON: %s", exc)
return PlainTextResponse(BAD_REQUEST_MESSAGE, status_code=status.HTTP_400_BAD_REQUEST)
@@ -45,8 +67,14 @@ async def publish_from_homeassistant(
INTERNAL_SERVER_ERROR_MESSAGE,
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
except (TickTickConfigError, TickTickRequestError, RuntimeError) as exc:
logger.warning("Home Assistant publish request failed during TickTick handling: %s", exc)
except (
TickTickConfigError,
TickTickRequestError,
HomeAssistantConfigError,
HomeAssistantRequestError,
RuntimeError,
) as exc:
logger.warning("Home Assistant publish request failed during integration handling: %s", exc)
return PlainTextResponse(
INTERNAL_SERVER_ERROR_MESSAGE,
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
+37
@@ -4,11 +4,14 @@ import json
from datetime import UTC, datetime, time, timedelta
from sqlalchemy.orm import Session
from app.config import Settings
from app.integrations.homeassistant import HomeAssistantClient
from app.integrations.ticktick import TICKTICK_DATETIME_FORMAT, TickTickClient, TickTickTask
from app.schemas.homeassistant import HomeAssistantPublishEnvelope
from app.schemas.location import LocationRecordRequest
from app.schemas.ticktick import TickTickActionTaskRequest
from app.services.location import record_location
from app.services.poo import publish_latest_poo_status
class UnsupportedHomeAssistantMessage(RuntimeError):
@@ -19,11 +22,23 @@ def handle_homeassistant_message(
session: Session,
envelope: HomeAssistantPublishEnvelope,
ticktick_client: TickTickClient | None = None,
poo_session: Session | None = None,
settings: Settings | None = None,
homeassistant_client: HomeAssistantClient | None = None,
) -> None:
if envelope.target == "location_recorder":
_handle_location_message(session, envelope)
return
if envelope.target == "poo_recorder":
_handle_poo_message(
envelope,
poo_session=poo_session,
settings=settings,
homeassistant_client=homeassistant_client,
)
return
if envelope.target == "ticktick":
_handle_ticktick_message(envelope, ticktick_client)
return
@@ -44,6 +59,28 @@ def _handle_location_message(session: Session, envelope: HomeAssistantPublishEnv
record_location(session, payload)
def _handle_poo_message(
envelope: HomeAssistantPublishEnvelope,
*,
poo_session: Session | None,
settings: Settings | None,
homeassistant_client: HomeAssistantClient | None,
) -> None:
if envelope.action != "get_latest":
raise UnsupportedHomeAssistantMessage(
f"Unsupported Home Assistant target/action: {envelope.target}/{envelope.action}"
)
if poo_session is None or settings is None or homeassistant_client is None:
raise RuntimeError("Poo recorder integration is unavailable")
publish_latest_poo_status(
session=poo_session,
settings=settings,
homeassistant_client=homeassistant_client,
)
def _handle_ticktick_message(
envelope: HomeAssistantPublishEnvelope,
ticktick_client: TickTickClient | None,
+6
@@ -0,0 +1,6 @@
services:
migration:
build: .
app:
build: .
+18 -1
@@ -1,10 +1,24 @@
services:
migration:
container_name: home-automation-migration
image: code.wanderingbadger.dev/tliu93/home-automation:latest
user: "1000:1000"
restart: "no"
init: true
command: ["python", "-m", "scripts.run_migrations"]
volumes:
- ./data:/app/data
- ./.env:/app/.env:ro
app:
container_name: home-automation-app
build: .
image: code.wanderingbadger.dev/tliu93/home-automation:latest
user: "1000:1000"
restart: unless-stopped
init: true
depends_on:
migration:
condition: service_completed_successfully
ports:
- "127.0.0.1:8881:8000"
volumes:
@@ -23,7 +37,10 @@ services:
GF_PLUGINS_PREINSTALL: frser-sqlite-datasource
volumes:
- ./data:/data/home-automation:ro
- ./grafana/provisioning:/etc/grafana/provisioning:ro
- ./grafana/dashboards:/var/lib/grafana/dashboards:ro
- homeautomation_grafana_storage:/var/lib/grafana
volumes:
homeautomation_grafana_storage:
name: homeautomation_grafana_storage
+1 -5
@@ -2,8 +2,4 @@
set -eu
python scripts/app_db_adopt.py
python scripts/location_db_adopt.py
python scripts/poo_db_adopt.py
exec uvicorn app.main:app --host 0.0.0.0 --port 8000
exec "$@"
+288
@@ -0,0 +1,288 @@
{
"apiVersion": "dashboard.grafana.app/v2",
"kind": "Dashboard",
"metadata": {
"name": "adzr6rv",
"namespace": "default",
"uid": "c5fc57e5-7fb5-4104-9861-023710ada568",
"resourceVersion": "1776634346371016",
"generation": 19,
"creationTimestamp": "2026-04-18T19:05:57Z",
"labels": {
"grafana.app/deprecatedInternalID": "945374452785152"
},
"annotations": {
"grafana.app/createdBy": "user:ffjhknvgkvhtsc",
"grafana.app/folder": "",
"grafana.app/saved-from-ui": "Grafana v13.0.1 (a100054f)",
"grafana.app/updatedBy": "user:ffjhknvgkvhtsc",
"grafana.app/updatedTimestamp": "2026-04-19T21:32:26Z"
}
},
"spec": {
"annotations": [
{
"kind": "AnnotationQuery",
"spec": {
"query": {
"kind": "DataQuery",
"group": "grafana",
"version": "v0",
"datasource": {
"name": "-- Grafana --"
},
"spec": {}
},
"enable": true,
"hide": true,
"iconColor": "rgba(0, 211, 255, 1)",
"name": "Annotations & Alerts",
"builtIn": true
}
}
],
"cursorSync": "Off",
"editable": true,
"elements": {
"panel-1": {
"kind": "Panel",
"spec": {
"id": 1,
"title": "轨迹",
"description": "",
"links": [],
"data": {
"kind": "QueryGroup",
"spec": {
"queries": [
{
"kind": "PanelQuery",
"spec": {
"query": {
"kind": "DataQuery",
"group": "frser-sqlite-datasource",
"version": "v0",
"datasource": {
"name": "ffjhr941d5iwwf"
},
"spec": {
"queryText": "SELECT\n datetime AS time,\n latitude,\n longitude,\n altitude\nFROM location\nWHERE person = 'Jiangxue'\n AND datetime >= '2021-04-19T21:29:57.036Z'\n AND datetime <= '2026-04-19T21:29:57.036Z'\n AND latitude != 0\n AND longitude != 0\nORDER BY datetime;\n",
"queryType": "table",
"rawQueryText": "SELECT\n datetime AS time,\n latitude,\n longitude,\n altitude\nFROM location\nWHERE person = '$person'\n AND datetime >= '${__from:date:iso}'\n AND datetime <= '${__to:date:iso}'\n AND latitude != 0\n AND longitude != 0\nORDER BY datetime;\n",
"timeColumns": [
"time",
"ts"
]
}
},
"refId": "A",
"hidden": false
}
}
],
"transformations": [],
"queryOptions": {}
}
},
"vizConfig": {
"kind": "VizConfig",
"group": "geomap",
"version": "13.0.1",
"spec": {
"options": {
"basemap": {
"config": {
"server": "streets"
},
"name": "Layer 0",
"noRepeat": false,
"type": "default"
},
"controls": {
"mouseWheelZoom": true,
"showAttribution": true,
"showDebug": false,
"showMeasure": false,
"showScale": false,
"showZoom": true
},
"layers": [
{
"config": {
"showLegend": false,
"style": {
"color": {
"fixed": "blue"
},
"opacity": 0.7,
"rotation": {
"fixed": 0,
"max": 360,
"min": -360,
"mode": "mod"
},
"size": {
"fixed": 3,
"max": 15,
"min": 2
},
"symbol": {
"fixed": "img/icons/marker/circle.svg",
"mode": "fixed"
},
"symbolAlign": {
"horizontal": "center",
"vertical": "center"
},
"textConfig": {
"fontSize": 12,
"offsetX": 0,
"offsetY": 0,
"textAlign": "center",
"textBaseline": "middle"
}
}
},
"layer-tooltip": true,
"name": "path",
"tooltip": true,
"type": "markers"
}
],
"tooltip": {
"mode": "details"
},
"view": {
"allLayers": true,
"dashboardVariable": false,
"id": "fit",
"lat": 0,
"lon": 0,
"noRepeat": false,
"shared": false,
"zoom": 15
}
},
"fieldConfig": {
"defaults": {
"thresholds": {
"mode": "absolute",
"steps": [
{
"value": 0,
"color": "green"
}
]
},
"color": {
"mode": "thresholds"
},
"custom": {
"hideFrom": {
"legend": false,
"tooltip": false,
"viz": false
}
}
},
"overrides": []
}
}
}
}
}
},
"layout": {
"kind": "GridLayout",
"spec": {
"items": [
{
"kind": "GridLayoutItem",
"spec": {
"x": 0,
"y": 0,
"width": 24,
"height": 18,
"element": {
"kind": "ElementReference",
"name": "panel-1"
}
}
}
]
}
},
"links": [],
"liveNow": false,
"preload": false,
"tags": [],
"timeSettings": {
"timezone": "browser",
"from": "now-5y",
"to": "now",
"autoRefresh": "",
"autoRefreshIntervals": [
"5s",
"10s",
"30s",
"1m",
"5m",
"15m",
"30m",
"1h",
"2h",
"1d"
],
"hideTimepicker": false,
"fiscalYearStartMonth": 0
},
"title": "轨迹",
"variables": [
{
"kind": "QueryVariable",
"spec": {
"name": "person",
"current": {
"text": "Jiangxue",
"value": "Jiangxue"
},
"label": "person",
"hide": "dontHide",
"refresh": "onDashboardLoad",
"skipUrlSync": false,
"description": "",
"query": {
"kind": "DataQuery",
"group": "frser-sqlite-datasource",
"version": "v0",
"datasource": {
"name": "ffjhr941d5iwwf"
},
"spec": {
"__legacyStringValue": "SELECT DISTINCT person\nFROM location\nORDER BY person;\n"
}
},
"regex": "",
"regexApplyTo": "value",
"sort": "disabled",
"definition": "SELECT DISTINCT person\nFROM location\nORDER BY person;\n",
"options": [],
"multi": false,
"includeAll": false,
"allowCustomValue": true
}
}
],
"preferences": {
"layout": {
"kind": "AutoGridLayout",
"spec": {
"maxColumnCount": 3,
"columnWidthMode": "standard",
"rowHeightMode": "standard",
"items": []
}
}
}
}
}
+231
@@ -0,0 +1,231 @@
{
"apiVersion": "dashboard.grafana.app/v2",
"kind": "Dashboard",
"metadata": {
"name": "adl5sjt",
"namespace": "default",
"uid": "d4c72406-9fc5-4b85-844b-be1250f1fa8b",
"resourceVersion": "1776606363367013",
"generation": 6,
"creationTimestamp": "2026-04-18T20:07:34Z",
"labels": {
"grafana.app/deprecatedInternalID": "960882027798528"
},
"annotations": {
"grafana.app/createdBy": "user:ffjhknvgkvhtsc",
"grafana.app/folder": "",
"grafana.app/saved-from-ui": "Grafana v13.0.1 (a100054f)",
"grafana.app/updatedBy": "user:ffjhknvgkvhtsc",
"grafana.app/updatedTimestamp": "2026-04-19T13:46:03Z"
}
},
"spec": {
"annotations": [
{
"kind": "AnnotationQuery",
"spec": {
"query": {
"kind": "DataQuery",
"group": "grafana",
"version": "v0",
"datasource": {
"name": "-- Grafana --"
},
"spec": {}
},
"enable": true,
"hide": true,
"iconColor": "rgba(0, 211, 255, 1)",
"name": "Annotations & Alerts",
"builtIn": true
}
}
],
"cursorSync": "Off",
"editable": true,
"elements": {
"panel-1": {
"kind": "Panel",
"spec": {
"id": 1,
"title": "Mika Poo",
"description": "Mika's poo",
"links": [],
"data": {
"kind": "QueryGroup",
"spec": {
"queries": [
{
"kind": "PanelQuery",
"spec": {
"query": {
"kind": "DataQuery",
"group": "frser-sqlite-datasource",
"version": "v0",
"datasource": {
"name": "ffjhkuu4hc3y8e"
},
"spec": {
"queryText": "SELECT\n latitude,\n longitude,\n timestamp\nFROM poo_records\nWHERE timestamp >= '${__from:date:iso}'\n AND timestamp <= '${__to:date:iso}'\n AND latitude != 0\n AND longitude != 0\nORDER BY timestamp;\n",
"queryType": "table",
"rawQueryText": "SELECT\n latitude,\n longitude,\n timestamp\nFROM poo_records\nWHERE timestamp >= '${__from:date:iso}'\n AND timestamp <= '${__to:date:iso}'\n AND latitude != 0\n AND longitude != 0\nORDER BY timestamp;\n",
"timeColumns": [
"time",
"ts"
]
}
},
"refId": "A",
"hidden": false
}
}
],
"transformations": [],
"queryOptions": {}
}
},
"vizConfig": {
"kind": "VizConfig",
"group": "geomap",
"version": "13.0.1",
"spec": {
"options": {
"basemap": {
"config": {},
"name": "Layer 0",
"noRepeat": false,
"type": "default"
},
"controls": {
"mouseWheelZoom": true,
"showAttribution": true,
"showDebug": false,
"showMeasure": false,
"showScale": false,
"showZoom": true
},
"layers": [
{
"config": {
"blur": 15,
"radius": 5,
"weight": {
"fixed": 1,
"max": 1,
"min": 0
}
},
"filterData": {
"id": "byRefId",
"options": "A"
},
"location": {
"mode": "auto"
},
"name": "Poo",
"tooltip": true,
"type": "heatmap"
}
],
"tooltip": {
"mode": "details"
},
"view": {
"allLayers": true,
"dashboardVariable": false,
"id": "zero",
"lat": 0,
"lon": 0,
"noRepeat": false,
"zoom": 1
}
},
"fieldConfig": {
"defaults": {
"thresholds": {
"mode": "absolute",
"steps": [
{
"value": 0,
"color": "green"
},
{
"value": 80,
"color": "red"
}
]
},
"color": {
"mode": "thresholds"
},
"custom": {
"hideFrom": {
"legend": false,
"tooltip": false,
"viz": false
}
}
},
"overrides": []
}
}
}
}
}
},
"layout": {
"kind": "GridLayout",
"spec": {
"items": [
{
"kind": "GridLayoutItem",
"spec": {
"x": 0,
"y": 0,
"width": 24,
"height": 19,
"element": {
"kind": "ElementReference",
"name": "panel-1"
}
}
}
]
}
},
"links": [],
"liveNow": false,
"preload": false,
"tags": [],
"timeSettings": {
"timezone": "browser",
"from": "now-5y",
"to": "now",
"autoRefresh": "",
"autoRefreshIntervals": [
"5s",
"10s",
"30s",
"1m",
"5m",
"15m",
"30m",
"1h",
"2h",
"1d"
],
"hideTimepicker": false,
"fiscalYearStartMonth": 0
},
"title": "Mika Poo",
"variables": [],
"preferences": {
"layout": {
"kind": "GridLayout",
"spec": {
"items": []
}
}
}
}
}
@@ -0,0 +1,13 @@
apiVersion: 1
providers:
- name: home-automation-dashboards
orgId: 1
folder: ""
type: file
disableDeletion: false
allowUiUpdates: false
updateIntervalSeconds: 30
options:
path: /var/lib/grafana/dashboards
foldersFromFilesStructure: false
@@ -0,0 +1,11 @@
apiVersion: 1
datasources:
- name: locationrecorder
uid: ffjhr941d5iwwf
type: frser-sqlite-datasource
access: proxy
isDefault: false
editable: false
jsonData:
path: /data/home-automation/locationRecorder.db
@@ -0,0 +1,11 @@
apiVersion: 1
datasources:
- name: poorecorder
uid: ffjhkuu4hc3y8e
type: frser-sqlite-datasource
access: proxy
isDefault: false
editable: false
jsonData:
path: /data/home-automation/pooRecorder.db
+31 -3
@@ -6,6 +6,8 @@ from pathlib import Path
from alembic import command
from alembic.config import Config
from alembic.script import ScriptDirectory
from alembic.util.exc import CommandError
PROJECT_ROOT = Path(__file__).resolve().parents[1]
if str(PROJECT_ROOT) not in sys.path:
@@ -35,6 +37,24 @@ def _make_alembic_config(database_url: str) -> Config:
return config
def _expected_head_revision(alembic_config: Config) -> str:
script = ScriptDirectory.from_config(alembic_config)
heads = script.get_heads()
if len(heads) != 1:
raise AppDatabaseAdoptionError(
f"Expected exactly one Alembic head for app DB, got {len(heads)}"
)
return heads[0]
def _is_known_revision(alembic_config: Config, revision: str) -> bool:
script = ScriptDirectory.from_config(alembic_config)
try:
return script.get_revision(revision) is not None
except CommandError:
return False
def _alembic_version_table_exists(database_path: Path) -> bool:
conn = sqlite3.connect(database_path)
try:
@@ -75,6 +95,8 @@ def _list_user_tables(database_path: Path) -> list[str]:
def validate_app_runtime_db(database_url: str) -> None:
database_path = _database_path_from_url(database_url)
alembic_config = _make_alembic_config(database_url)
expected_revision = _expected_head_revision(alembic_config)
if not database_path.exists():
raise AppDatabaseAdoptionError(
"App DB file was not found. Run 'python scripts/app_db_adopt.py' first to "
@@ -88,22 +110,28 @@ def validate_app_runtime_db(database_url: str) -> None:
)
current_revision = _fetch_alembic_revision(database_path)
if current_revision != APP_BASELINE_REVISION:
if current_revision != expected_revision:
raise AppDatabaseAdoptionError(
"App DB revision mismatch. Refusing to start the app: "
f"expected {APP_BASELINE_REVISION}, got {current_revision}"
f"expected {expected_revision}, got {current_revision}"
)
def adopt_or_initialize_app_db(database_url: str) -> str:
database_path = _database_path_from_url(database_url)
alembic_config = _make_alembic_config(database_url)
expected_revision = _expected_head_revision(alembic_config)
if database_path.exists():
if _alembic_version_table_exists(database_path):
current_revision = _fetch_alembic_revision(database_path)
if current_revision == APP_BASELINE_REVISION:
if current_revision == expected_revision:
return "already_managed"
if not _is_known_revision(alembic_config, current_revision):
raise AppDatabaseAdoptionError(
"App DB is already Alembic-managed but revision does not match "
f"a known migration revision: got {current_revision}"
)
command.upgrade(alembic_config, "head")
return "upgraded"
+34 -6
@@ -6,6 +6,8 @@ from pathlib import Path
from alembic import command
from alembic.config import Config
from alembic.script import ScriptDirectory
from alembic.util.exc import CommandError
PROJECT_ROOT = Path(__file__).resolve().parents[1]
if str(PROJECT_ROOT) not in sys.path:
@@ -43,6 +45,24 @@ def _make_alembic_config(database_url: str) -> Config:
return config
def _expected_head_revision(alembic_config: Config) -> str:
script = ScriptDirectory.from_config(alembic_config)
heads = script.get_heads()
if len(heads) != 1:
raise LocationDatabaseAdoptionError(
f"Expected exactly one Alembic head for location DB, got {len(heads)}"
)
return heads[0]
def _is_known_revision(alembic_config: Config, revision: str) -> bool:
script = ScriptDirectory.from_config(alembic_config)
try:
return script.get_revision(revision) is not None
except CommandError:
return False
def _location_table_exists(database_path: Path) -> bool:
conn = sqlite3.connect(database_path)
try:
@@ -117,6 +137,8 @@ def validate_legacy_location_db(database_url: str) -> None:
def validate_location_runtime_db(database_url: str) -> None:
database_path = _database_path_from_url(database_url)
alembic_config = _make_alembic_config(database_url)
expected_revision = _expected_head_revision(alembic_config)
if not database_path.exists():
raise LocationDatabaseAdoptionError(
"Location DB file was not found. Run 'python scripts/location_db_adopt.py' "
@@ -131,30 +153,36 @@ def validate_location_runtime_db(database_url: str) -> None:
)
current_revision = _fetch_alembic_revision(database_path)
if current_revision != LOCATION_BASELINE_REVISION:
if current_revision != expected_revision:
raise LocationDatabaseAdoptionError(
"Location DB revision mismatch. Refusing to start the app: "
f"expected {LOCATION_BASELINE_REVISION}, got {current_revision}"
f"expected {expected_revision}, got {current_revision}"
)
def adopt_or_initialize_location_db(database_url: str) -> str:
database_path = _database_path_from_url(database_url)
alembic_config = _make_alembic_config(database_url)
expected_revision = _expected_head_revision(alembic_config)
if database_path.exists():
if _alembic_version_table_exists(database_path):
current_revision = _fetch_alembic_revision(database_path)
if current_revision != LOCATION_BASELINE_REVISION:
if current_revision == expected_revision:
return "already_managed"
if not _is_known_revision(alembic_config, current_revision):
raise LocationDatabaseAdoptionError(
"Location DB is already Alembic-managed but revision does not match "
f"the expected baseline: expected {LOCATION_BASELINE_REVISION}, "
f"got {current_revision}"
f"a known migration revision: got {current_revision}"
)
return "already_managed"
command.upgrade(alembic_config, "head")
return "upgraded"
validate_legacy_location_db(database_url)
command.stamp(alembic_config, LOCATION_BASELINE_REVISION)
if LOCATION_BASELINE_REVISION != expected_revision:
command.upgrade(alembic_config, "head")
return "upgraded"
return "adopted"
database_path.parent.mkdir(parents=True, exist_ok=True)
+34 -6
@@ -6,6 +6,8 @@ from pathlib import Path
from alembic import command
from alembic.config import Config
from alembic.script import ScriptDirectory
from alembic.util.exc import CommandError
PROJECT_ROOT = Path(__file__).resolve().parents[1]
if str(PROJECT_ROOT) not in sys.path:
@@ -42,6 +44,24 @@ def _make_alembic_config(database_url: str) -> Config:
return config
def _expected_head_revision(alembic_config: Config) -> str:
script = ScriptDirectory.from_config(alembic_config)
heads = script.get_heads()
if len(heads) != 1:
raise PooDatabaseAdoptionError(
f"Expected exactly one Alembic head for poo DB, got {len(heads)}"
)
return heads[0]
def _is_known_revision(alembic_config: Config, revision: str) -> bool:
script = ScriptDirectory.from_config(alembic_config)
try:
return script.get_revision(revision) is not None
except CommandError:
return False
def _poo_table_exists(database_path: Path) -> bool:
conn = sqlite3.connect(database_path)
try:
@@ -112,6 +132,8 @@ def validate_legacy_poo_db(database_url: str) -> None:
def validate_poo_runtime_db(database_url: str) -> None:
database_path = _database_path_from_url(database_url)
alembic_config = _make_alembic_config(database_url)
expected_revision = _expected_head_revision(alembic_config)
if not database_path.exists():
raise PooDatabaseAdoptionError(
"Poo DB file was not found. Run 'python scripts/poo_db_adopt.py' first to "
@@ -126,30 +148,36 @@ def validate_poo_runtime_db(database_url: str) -> None:
)
current_revision = _fetch_alembic_revision(database_path)
if current_revision != POO_BASELINE_REVISION:
if current_revision != expected_revision:
raise PooDatabaseAdoptionError(
"Poo DB revision mismatch. Refusing to start the app: "
f"expected {POO_BASELINE_REVISION}, got {current_revision}"
f"expected {expected_revision}, got {current_revision}"
)
def adopt_or_initialize_poo_db(database_url: str) -> str:
database_path = _database_path_from_url(database_url)
alembic_config = _make_alembic_config(database_url)
expected_revision = _expected_head_revision(alembic_config)
if database_path.exists():
if _alembic_version_table_exists(database_path):
current_revision = _fetch_alembic_revision(database_path)
if current_revision != POO_BASELINE_REVISION:
if current_revision == expected_revision:
return "already_managed"
if not _is_known_revision(alembic_config, current_revision):
raise PooDatabaseAdoptionError(
"Poo DB is already Alembic-managed but revision does not match "
f"the expected baseline: expected {POO_BASELINE_REVISION}, "
f"got {current_revision}"
f"a known migration revision: got {current_revision}"
)
return "already_managed"
command.upgrade(alembic_config, "head")
return "upgraded"
validate_legacy_poo_db(database_url)
command.stamp(alembic_config, POO_BASELINE_REVISION)
if POO_BASELINE_REVISION != expected_revision:
command.upgrade(alembic_config, "head")
return "upgraded"
return "adopted"
database_path.parent.mkdir(parents=True, exist_ok=True)
+25
@@ -0,0 +1,25 @@
from __future__ import annotations
from app.config import get_settings
from scripts.app_db_adopt import adopt_or_initialize_app_db
from scripts.location_db_adopt import adopt_or_initialize_location_db
from scripts.poo_db_adopt import adopt_or_initialize_poo_db
def run_all_migrations() -> dict[str, str]:
settings = get_settings()
return {
"app": adopt_or_initialize_app_db(settings.app_database_url),
"location": adopt_or_initialize_location_db(settings.location_database_url),
"poo": adopt_or_initialize_poo_db(settings.poo_database_url),
}
def main() -> None:
results = run_all_migrations()
for database_name, result in results.items():
print(f"{database_name}: {result}")
if __name__ == "__main__":
main()
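As a quick sanity check (mirroring the repo's tests), running the job twice should initialize every database on the first pass and be a no-op on the second:
```python
from scripts.run_migrations import run_all_migrations

# First run initializes (or adopts) each database; the second run
# should report every database as already managed.
first = run_all_migrations()
second = run_all_migrations()

assert set(first) == {"app", "location", "poo"}
assert all(result == "already_managed" for result in second.values())
```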
+4 -1
@@ -37,12 +37,13 @@ def test_status_endpoint(client: TestClient) -> None:
def test_app_start_fails_when_app_db_missing(tmp_path, monkeypatch: pytest.MonkeyPatch) -> None:
missing_app_path = tmp_path / "missing_app.db"
poo_database_path = tmp_path / "poo_ready.db"
location_database_path = tmp_path / "location_ready.db"
command.upgrade(_make_poo_alembic_config(f"sqlite:///{poo_database_path}"), "head")
command.upgrade(_make_alembic_config(f"sqlite:///{location_database_path}"), "head")
monkeypatch.setenv("APP_DATABASE_URL", f"sqlite:///{tmp_path / 'missing_app.db'}")
monkeypatch.setenv("APP_DATABASE_URL", f"sqlite:///{missing_app_path}")
monkeypatch.setenv("AUTH_BOOTSTRAP_USERNAME", "admin")
monkeypatch.setenv("AUTH_BOOTSTRAP_PASSWORD", "test-password")
monkeypatch.setenv("LOCATION_DATABASE_URL", f"sqlite:///{location_database_path}")
@@ -54,6 +55,8 @@ def test_app_start_fails_when_app_db_missing(tmp_path, monkeypatch: pytest.Monke
with pytest.raises(RuntimeError, match="Run 'python scripts/app_db_adopt.py' first"):
anyio.run(_run_lifespan, app)
assert not missing_app_path.exists()
get_settings.cache_clear()
reset_auth_db_caches()
+215
@@ -0,0 +1,215 @@
from pathlib import Path
import sqlite3
import anyio
import pytest
import yaml
from alembic import command
from app.auth_db import reset_auth_db_caches
from app.config import get_settings
from app.main import create_app
from scripts.app_db_adopt import APP_BASELINE_REVISION
from scripts.location_db_adopt import EXPECTED_USER_VERSION as LOCATION_USER_VERSION
from scripts.location_db_adopt import LOCATION_BASELINE_REVISION
from scripts.poo_db_adopt import EXPECTED_USER_VERSION as POO_USER_VERSION
from scripts.poo_db_adopt import POO_BASELINE_REVISION
from scripts.run_migrations import run_all_migrations
from tests.conftest import _make_alembic_config, _make_poo_alembic_config
PROJECT_ROOT = Path(__file__).resolve().parents[1]
def _read_yaml(path: str) -> dict:
return yaml.safe_load((PROJECT_ROOT / path).read_text())
async def _run_lifespan(app) -> None:
async with app.router.lifespan_context(app):
return None
def _configure_database_env(tmp_path: Path, monkeypatch: pytest.MonkeyPatch) -> dict[str, Path | str]:
app_path = tmp_path / "app.db"
location_path = tmp_path / "location.db"
poo_path = tmp_path / "poo.db"
monkeypatch.setenv("APP_DATABASE_URL", f"sqlite:///{app_path}")
monkeypatch.setenv("LOCATION_DATABASE_URL", f"sqlite:///{location_path}")
monkeypatch.setenv("POO_DATABASE_URL", f"sqlite:///{poo_path}")
monkeypatch.setenv("AUTH_BOOTSTRAP_USERNAME", "admin")
monkeypatch.setenv("AUTH_BOOTSTRAP_PASSWORD", "test-password")
monkeypatch.setenv("AUTH_COOKIE_SECURE_OVERRIDE", "false")
get_settings.cache_clear()
reset_auth_db_caches()
return {
"app_path": app_path,
"app_url": f"sqlite:///{app_path}",
"location_path": location_path,
"location_url": f"sqlite:///{location_path}",
"poo_path": poo_path,
"poo_url": f"sqlite:///{poo_path}",
}
def _create_legacy_location_db(database_path: Path) -> None:
conn = sqlite3.connect(database_path)
conn.execute(
"""
CREATE TABLE location (
person TEXT NOT NULL,
datetime TEXT NOT NULL,
latitude REAL NOT NULL,
longitude REAL NOT NULL,
altitude REAL,
PRIMARY KEY (person, datetime)
)
"""
)
conn.execute(
"INSERT INTO location (person, datetime, latitude, longitude, altitude) VALUES (?, ?, ?, ?, ?)",
("alice", "2026-04-22T10:00:00Z", 1.23, 4.56, 7.89),
)
conn.execute(f"PRAGMA user_version = {LOCATION_USER_VERSION}")
conn.commit()
conn.close()
def _create_legacy_poo_db(database_path: Path) -> None:
conn = sqlite3.connect(database_path)
conn.execute(
"""
CREATE TABLE poo_records (
timestamp TEXT NOT NULL,
status TEXT NOT NULL,
latitude REAL NOT NULL,
longitude REAL NOT NULL,
PRIMARY KEY (timestamp)
)
"""
)
conn.execute(
"INSERT INTO poo_records (timestamp, status, latitude, longitude) VALUES (?, ?, ?, ?)",
("2026-04-22T11:00:00Z", "complete", 9.87, 6.54),
)
conn.execute(f"PRAGMA user_version = {POO_USER_VERSION}")
conn.commit()
conn.close()
def test_compose_uses_migration_job_before_app() -> None:
compose = _read_yaml("docker-compose.yml")
override = _read_yaml("docker-compose.override.yml")
migration_service = compose["services"]["migration"]
app_service = compose["services"]["app"]
assert migration_service["command"] == ["python", "-m", "scripts.run_migrations"]
assert migration_service["restart"] == "no"
assert app_service["depends_on"]["migration"]["condition"] == "service_completed_successfully"
assert override["services"]["migration"]["build"] == "."
assert override["services"]["app"]["build"] == "."
def test_image_defaults_to_uvicorn_only() -> None:
dockerfile = (PROJECT_ROOT / "Dockerfile").read_text()
entrypoint = (PROJECT_ROOT / "docker/entrypoint.sh").read_text()
assert 'CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]' in dockerfile
assert 'exec "$@"' in entrypoint
assert "app_db_adopt" not in entrypoint
assert "location_db_adopt" not in entrypoint
assert "poo_db_adopt" not in entrypoint
def test_migration_runner_initializes_and_is_idempotent(
tmp_path: Path, monkeypatch: pytest.MonkeyPatch
) -> None:
database_urls = _configure_database_env(tmp_path, monkeypatch)
first_run = run_all_migrations()
second_run = run_all_migrations()
assert first_run == {"app": "initialized", "location": "initialized", "poo": "initialized"}
assert second_run == {
"app": "already_managed",
"location": "already_managed",
"poo": "already_managed",
}
conn = sqlite3.connect(database_urls["app_path"])
try:
assert conn.execute("SELECT version_num FROM alembic_version").fetchone()[0] == APP_BASELINE_REVISION
tables = {
row[0]
for row in conn.execute(
"SELECT name FROM sqlite_master WHERE type = 'table' AND name NOT LIKE 'sqlite_%'"
).fetchall()
}
finally:
conn.close()
assert {"auth_users", "auth_sessions", "app_config", "alembic_version"} <= tables
conn = sqlite3.connect(database_urls["location_path"])
try:
assert conn.execute("SELECT version_num FROM alembic_version").fetchone()[0] == LOCATION_BASELINE_REVISION
finally:
conn.close()
conn = sqlite3.connect(database_urls["poo_path"])
try:
assert conn.execute("SELECT version_num FROM alembic_version").fetchone()[0] == POO_BASELINE_REVISION
finally:
conn.close()
get_settings.cache_clear()
reset_auth_db_caches()
def test_migration_runner_adopts_legacy_sqlite_without_data_loss(
tmp_path: Path, monkeypatch: pytest.MonkeyPatch
) -> None:
database_urls = _configure_database_env(tmp_path, monkeypatch)
_create_legacy_location_db(database_urls["location_path"])
_create_legacy_poo_db(database_urls["poo_path"])
results = run_all_migrations()
assert results == {"app": "initialized", "location": "adopted", "poo": "adopted"}
conn = sqlite3.connect(database_urls["location_path"])
try:
assert conn.execute("SELECT version_num FROM alembic_version").fetchone()[0] == LOCATION_BASELINE_REVISION
assert conn.execute("SELECT COUNT(*) FROM location").fetchone()[0] == 1
finally:
conn.close()
conn = sqlite3.connect(database_urls["poo_path"])
try:
assert conn.execute("SELECT version_num FROM alembic_version").fetchone()[0] == POO_BASELINE_REVISION
assert conn.execute("SELECT COUNT(*) FROM poo_records").fetchone()[0] == 1
finally:
conn.close()
get_settings.cache_clear()
reset_auth_db_caches()
def test_app_startup_still_fails_closed_without_running_adoption(
tmp_path: Path, monkeypatch: pytest.MonkeyPatch
) -> None:
database_urls = _configure_database_env(tmp_path, monkeypatch)
missing_app_path = database_urls["app_path"]
command.upgrade(_make_alembic_config(database_urls["location_url"]), "head")
command.upgrade(_make_poo_alembic_config(database_urls["poo_url"]), "head")
app = create_app()
with pytest.raises(RuntimeError, match="Run 'python scripts/app_db_adopt.py' first"):
anyio.run(_run_lifespan, app)
assert not Path(missing_app_path).exists()
get_settings.cache_clear()
reset_auth_db_caches()
+158
@@ -1,5 +1,21 @@
from sqlalchemy import text
import app.db as app_db
import app.poo_db as poo_db
from app.config import Settings, get_settings
from app.dependencies import get_app_settings, get_homeassistant_client
from app.main import create_app
class _FakeHomeAssistantClient:
def __init__(self) -> None:
self.sensor_calls: list[dict] = []
def publish_sensor(self, *, entity_id: str, state: str, attributes: dict | None = None) -> None:
self.sensor_calls.append(
{"entity_id": entity_id, "state": state, "attributes": attributes or {}}
)
def test_homeassistant_publish_records_location(location_client) -> None:
client, engine = location_client
@@ -141,6 +157,148 @@ def test_homeassistant_publish_rejects_invalid_ticktick_content(location_client)
assert response.text == "bad request"
def test_homeassistant_publish_poo_get_latest_publishes_latest_status(
ready_location_database,
ready_poo_database,
auth_database,
monkeypatch,
) -> None:
location_engine = app_db.create_engine(
ready_location_database["location_url"],
connect_args={"check_same_thread": False},
)
location_session_local = app_db.sessionmaker(
bind=location_engine,
autoflush=False,
autocommit=False,
)
poo_engine = poo_db.create_engine(
ready_poo_database["poo_url"],
connect_args={"check_same_thread": False},
)
poo_session_local = poo_db.sessionmaker(
bind=poo_engine,
autoflush=False,
autocommit=False,
)
fake_ha = _FakeHomeAssistantClient()
settings = Settings(
poo_sensor_entity_name="sensor.test_poo_status",
poo_sensor_friendly_name="Poo Status",
)
monkeypatch.setattr(app_db, "engine", location_engine)
monkeypatch.setattr(app_db, "SessionLocal", location_session_local)
monkeypatch.setattr(poo_db, "poo_engine", poo_engine)
monkeypatch.setattr(poo_db, "PooSessionLocal", poo_session_local)
test_app = create_app()
test_app.dependency_overrides[get_homeassistant_client] = lambda: fake_ha
test_app.dependency_overrides[get_app_settings] = lambda: settings
with poo_engine.begin() as conn:
conn.execute(
text(
"INSERT INTO poo_records (timestamp, status, latitude, longitude) "
"VALUES (:timestamp, :status, :latitude, :longitude)"
),
{
"timestamp": "2026-04-20T10:05Z",
"status": "done",
"latitude": 1.23,
"longitude": 4.56,
},
)
try:
from fastapi.testclient import TestClient
with TestClient(test_app) as client:
response = client.post(
"/homeassistant/publish",
json={
"target": "poo_recorder",
"action": "get_latest",
"content": "",
},
)
assert response.status_code == 200
assert response.text == ""
assert len(fake_ha.sensor_calls) == 1
assert fake_ha.sensor_calls[0]["entity_id"] == "sensor.test_poo_status"
assert fake_ha.sensor_calls[0]["state"] == "done"
assert fake_ha.sensor_calls[0]["attributes"]["friendly_name"] == "Poo Status"
assert fake_ha.sensor_calls[0]["attributes"]["last_poo"]
finally:
test_app.dependency_overrides.clear()
get_settings.cache_clear()
location_engine.dispose()
poo_engine.dispose()
def test_homeassistant_publish_returns_internal_error_for_unknown_poo_action(
ready_location_database,
ready_poo_database,
auth_database,
monkeypatch,
) -> None:
location_engine = app_db.create_engine(
ready_location_database["location_url"],
connect_args={"check_same_thread": False},
)
location_session_local = app_db.sessionmaker(
bind=location_engine,
autoflush=False,
autocommit=False,
)
poo_engine = poo_db.create_engine(
ready_poo_database["poo_url"],
connect_args={"check_same_thread": False},
)
poo_session_local = poo_db.sessionmaker(
bind=poo_engine,
autoflush=False,
autocommit=False,
)
fake_ha = _FakeHomeAssistantClient()
settings = Settings(
poo_sensor_entity_name="sensor.test_poo_status",
poo_sensor_friendly_name="Poo Status",
)
monkeypatch.setattr(app_db, "engine", location_engine)
monkeypatch.setattr(app_db, "SessionLocal", location_session_local)
monkeypatch.setattr(poo_db, "poo_engine", poo_engine)
monkeypatch.setattr(poo_db, "PooSessionLocal", poo_session_local)
test_app = create_app()
test_app.dependency_overrides[get_homeassistant_client] = lambda: fake_ha
test_app.dependency_overrides[get_app_settings] = lambda: settings
try:
from fastapi.testclient import TestClient
with TestClient(test_app) as client:
response = client.post(
"/homeassistant/publish",
json={
"target": "poo_recorder",
"action": "unknown_action",
"content": "",
},
)
assert response.status_code == 500
assert response.text == "internal server error"
assert fake_ha.sensor_calls == []
finally:
test_app.dependency_overrides.clear()
get_settings.cache_clear()
location_engine.dispose()
poo_engine.dispose()
def test_homeassistant_publish_returns_not_implemented_for_unknown_location_action(
location_client,
) -> None:
+1 -1
@@ -343,7 +343,7 @@ def test_location_db_adoption_fails_closed_on_alembic_revision_mismatch(
conn.commit()
conn.close()
with pytest.raises(LocationDatabaseAdoptionError, match="revision does not match"):
with pytest.raises(LocationDatabaseAdoptionError, match="known migration revision"):
adopt_or_initialize_location_db(f"sqlite:///{database_path}")