Compare commits

...

94 Commits

Author SHA1 Message Date
9b0ffc4309 Merge branch 'dev' into agent/Rex/CUB-130-add-edit-filament-form
All checks were successful
Dev Build / build-test (pull_request) Successful in 1m32s
2026-05-15 11:00:00 -04:00
10c9340e74 Merge pull request 'CUB-133: Build Dashboard page with summary cards' (#42) from agent/rex/CUB-133-dashboard-page into dev
All checks were successful
Dev Build / build-test (push) Successful in 1m34s
Reviewed-on: #42
2026-05-15 10:58:22 -04:00
ffff4213b6 Merge branch 'dev' into agent/rex/CUB-133-dashboard-page - resolve App.tsx conflict
All checks were successful
Dev Build / build-test (pull_request) Successful in 1m23s
2026-05-12 15:51:21 -04:00
fc564c6c5a CUB-130: Build Add/Edit Filament form with backend endpoints for finishes and modifiers
All checks were successful
Dev Build / build-test (pull_request) Successful in 2m39s
2026-05-10 14:13:29 -04:00
f1614029b5 Merge pull request 'CUB-132: Build Filament Inventory list page with search and filters' (#43) from agent/Rex/CUB-132-filament-inventory-list into dev
All checks were successful
Dev Build / build-test (push) Successful in 2m14s
2026-05-08 16:26:14 -04:00
1109d1dd2f CUB-132: Build Filament Inventory list page with search and filters
All checks were successful
Dev Build / build-test (pull_request) Successful in 2m10s
2026-05-08 16:22:03 -04:00
32798fbf14 CUB-133: Build Dashboard page with summary cards
All checks were successful
Dev Build / build-test (pull_request) Successful in 2m10s
2026-05-07 20:07:45 -04:00
fd26b205bf Merge pull request 'CUB-136: add SSE endpoint in Go backend' (#41) from agent/dex/CUB-136-sse-endpoint into dev
All checks were successful
Dev Build / build-test (push) Successful in 1m43s
Reviewed-on: #41
2026-05-07 09:10:20 -04:00
41f66005a6 CUB-136: add SSE endpoint in Go backend
All checks were successful
Dev Build / build-test (pull_request) Successful in 2m9s
2026-05-07 08:29:34 -04:00
62d74beba4 CUB-113: implement core CRUD API endpoints
All checks were successful
Dev Build / build-test (push) Successful in 2m3s
2026-05-06 20:57:32 -04:00
fca2ef5b84 CUB-113: implement core CRUD API endpoints
Some checks failed
Dev Build / build-test (pull_request) Failing after 2m4s
- Add dtos package with request/response structs
- Add repositories: Material, Filament, Printer, PrintJob, UsageLog
- Add services: FilamentService, PrinterService, PrintJobService
- Add handlers for all 5 resources with consistent error responses
- Wire all endpoints into Chi router under /api
- Validation on POST/PUT filament endpoints
- Filter/pagination support on list endpoints
- Soft-delete for filaments (DELETE /api/filaments/{id})
- go build ./... && go vet ./... → PASS
2026-05-06 14:24:58 -04:00
3ac8432360 Merge pull request 'CUB-116: Scaffold React frontend — Vite, TS, Tailwind' (#39) from agent/rex/CUB-116-scaffold-react-frontend-v2 into dev
All checks were successful
Dev Build / build-test (push) Successful in 1m34s
Reviewed-on: #39
2026-05-06 14:14:57 -04:00
f15597966f Merge branch 'dev' into agent/rex/CUB-116-scaffold-react-frontend-v2
All checks were successful
Dev Build / build-test (pull_request) Successful in 1m30s
2026-05-06 14:14:36 -04:00
a54fcdd371 CUB-116: scaffold React frontend with Vite, TypeScript, Tailwind
All checks were successful
Dev Build / build-test (pull_request) Successful in 1m26s
2026-05-06 14:02:57 -04:00
1b86d617cd Merge pull request 'CUB-111: Merge PostgreSQL schema and Go models (resolved)' (#38) from fix/CUB-111-merge into dev
Some checks failed
Dev Build / build-test (push) Failing after 1m56s
Reviewed-on: #38
2026-05-06 13:57:47 -04:00
otto-bot
fd39fff433 CUB-111: merge PostgreSQL schema and Go models into dev
Some checks failed
Dev Build / build-test (pull_request) Failing after 1m46s
2026-05-06 13:56:57 -04:00
2243859286 Merge pull request 'CUB-112: Scaffold Go backend' (#37) from agent/dex/CUB-112-go-scaffold into dev
Some checks failed
Dev Build / build-test (push) Failing after 1m49s
Reviewed-on: #37
2026-05-06 13:55:01 -04:00
dex-bot
3fe0850711 CUB-112: scaffold Go backend with Chi, pgx, health check
Some checks failed
Dev Build / build-test (pull_request) Failing after 1m39s
2026-05-06 12:20:31 -04:00
42285c5dac Merge pull request 'CUB-33: Integrate Moonraker filament usage polling' (#33) from agent/dex/CUB-33-moonraker-usage-polling-v2 into dev
Some checks failed
Dev Build / build-test (push) Failing after 2m26s
2026-04-29 17:18:05 -04:00
9cd619b5ee CUB-33: integrate Moonraker filament usage polling
Some checks failed
Dev Build / build-test (pull_request) Failing after 2m21s
2026-04-29 11:50:18 -04:00
ddae95767f Merge pull request 'CUB-35: Build add/edit filament modal' (#20) from agent/rex/CUB-35-filament-add-edit-modal into dev
Some checks failed
Dev Build / build-test (push) Failing after 4m26s
2026-04-29 11:29:32 -04:00
15187cab65 CUB-35: build add/edit filament modal with Angular Material Dialog
Some checks failed
Dev Build / build-test (pull_request) Failing after 2m28s
2026-04-29 11:16:15 -04:00
9112f78641 Merge pull request 'CUB-32: Add usage logging service' (#11) from agent/dex/CUB-32-usage-logging-service into dev
All checks were successful
Dev Build / build-test (push) Successful in 2m48s
Dev Build / build-test (pull_request) Successful in 2m15s
2026-04-29 10:51:36 -04:00
57157ad947 CUB-32: Add usage logging service with EF Core entity, service, controller, and migration
All checks were successful
Dev Build / build-test (pull_request) Successful in 3m11s
2026-04-29 10:23:31 -04:00
a2707e02ee Merge pull request 'CUB-38: Implement low filament alert logic with configurable threshold' (#17) from agent/dex/CUB-38-low-filament-alert into dev
All checks were successful
Dev Build / build-test (push) Successful in 2m14s
2026-04-29 10:11:49 -04:00
9192ece040 CUB-38: implement low filament alert logic with configurable threshold
All checks were successful
Dev Build / build-test (pull_request) Successful in 2m12s
2026-04-28 12:42:03 +00:00
fa4a4c21b3 Merge pull request 'CUB-42: Show filament cost and usage in UI' (#31) from agent/rex/CUB-42-filament-cost-usage-ui into dev
All checks were successful
Dev Build / build-test (push) Successful in 2m10s
Reviewed-on: #31
Reviewed-by: Joshua <joshua@cnjmail.com>
2026-04-28 06:39:51 -04:00
f2d9b7f455 CUB-42: Show filament cost and usage in UI
All checks were successful
Dev Build / build-test (pull_request) Successful in 2m9s
2026-04-27 21:34:47 -04:00
808d5f909d Merge pull request 'CUB-43: Add inventory dashboard summary' (#23) from agent/rex/CUB-43-inventory-dashboard-summary into dev
All checks were successful
Dev Build / build-test (push) Successful in 2m38s
2026-04-27 21:29:25 -04:00
b7e61fab8a CUB-43: Add inventory dashboard summary component
All checks were successful
Dev Build / build-test (pull_request) Successful in 2m19s
2026-04-27 21:25:56 -04:00
5ede6a8eb6 ci: re-trigger pipeline with working-directory fix 2026-04-27 21:25:56 -04:00
e56aa3ba39 CUB-43: add inventory dashboard summary component with FilamentService 2026-04-27 21:25:56 -04:00
f70495a85c Merge pull request 'CUB-9: Implement DELETE /filaments/{id}' (#30) from agent/dex/CUB-9-delete-filaments into dev
All checks were successful
Dev Build / build-test (push) Successful in 2m10s
2026-04-27 21:19:24 -04:00
bb35ed1eab feat(CUB-9): Implement DELETE /filaments/{id}
All checks were successful
Dev Build / build-test (pull_request) Successful in 2m7s
2026-04-27 21:16:56 -04:00
1f03606468 ci: simplify dev pipeline to build-test only (remove deploy/notify stubs)
All checks were successful
Dev Build / build-test (push) Successful in 3m46s
2026-04-27 20:59:09 -04:00
1b4fc22f59 ci: re-trigger pipeline with working-directory fix
Some checks failed
Dev Build / build-test (push) Successful in 2m10s
Dev Build / deploy-dev (push) Failing after 3s
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 3s
2026-04-27 20:50:42 -04:00
b86dda97a3 Merge pull request 'CUB-8: Create background service for Moonraker mapping' (#29) from agent/dex/CUB-8-background-service-moonraker into dev
Some checks failed
Dev Build / build-test (push) Successful in 2m12s
Dev Build / deploy-dev (push) Failing after 4s
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 4s
2026-04-27 20:42:50 -04:00
8b2a29881d feat(CUB-8): Create background service for Moonraker mapping
All checks were successful
Dev Build / build-test (pull_request) Successful in 2m7s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Has been skipped
2026-04-27 20:40:23 -04:00
90a89eecf3 Merge pull request 'CUB-6: Fix MoonrakerClient namespace to match directory structure' (#28) from agent/dex/CUB-6-moonrakerclient-basic into dev
Some checks failed
Dev Build / build-test (push) Successful in 2m15s
Dev Build / deploy-dev (push) Failing after 3s
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 3s
2026-04-27 20:32:08 -04:00
215033f3e5 Merge branch 'dev' into agent/dex/CUB-6-moonrakerclient-basic
All checks were successful
Dev Build / build-test (pull_request) Successful in 4m3s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Has been skipped
2026-04-27 20:32:03 -04:00
a28d032b16 fix: add working-directory: ./backend to dotnet steps — resolves MSB1003
Some checks failed
Dev Build / deploy-dev (push) Has been cancelled
Dev Build / notify-success (push) Has been cancelled
Dev Build / notify-failure (push) Has been cancelled
Dev Build / build-test (push) Has been cancelled
2026-04-27 20:30:53 -04:00
a90627de28 CUB-6: fix MoonrakerClient namespace to match directory structure
Some checks failed
Dev Build / build-test (pull_request) Failing after 1m0s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 4s
2026-04-27 20:29:25 -04:00
e9e856a012 Merge pull request 'CUB-5: Implement GET /filaments and GET /filaments/{id}' (#27) from agent/dex/CUB-5-get-filaments-endpoints into dev
Some checks failed
Dev Build / deploy-dev (push) Has been skipped
Dev Build / notify-failure (push) Successful in 3s
Dev Build / build-test (push) Failing after 59s
Dev Build / notify-success (push) Has been skipped
2026-04-27 20:21:53 -04:00
46d28676f0 CUB-5: Add 400 BadRequest ProducesResponseType to GET /filaments endpoint
Some checks failed
Dev Build / build-test (pull_request) Failing after 54s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 4s
2026-04-27 20:20:04 -04:00
ed0efd598b Merge pull request 'CUB-7: Implement POST /filaments with validation' (#26) from agent/dex/CUB-7-post-filaments-validation into dev
Some checks failed
Dev Build / build-test (push) Failing after 1m0s
Dev Build / deploy-dev (push) Has been skipped
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 6s
2026-04-27 19:06:12 -04:00
19415003a2 feat(CUB-7): Add XML doc comments to FilamentValidators constructors
Some checks failed
Dev Build / build-test (pull_request) Failing after 56s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
2026-04-27 19:01:19 -04:00
7904fcda02 Merge pull request 'CUB-10: Create IMoonrakerClient interface and DTOs' (#25) from agent/dex/CUB-10-imoonrakerclient-interface-dtos into dev
Some checks failed
Dev Build / build-test (push) Failing after 1m3s
Dev Build / deploy-dev (push) Has been skipped
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 3s
2026-04-27 18:50:47 -04:00
3d3b7059cf Merge branch 'dev' into agent/dex/CUB-10-imoonrakerclient-interface-dtos
Some checks failed
Dev Build / build-test (pull_request) Failing after 57s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
2026-04-27 18:50:41 -04:00
fc6134b162 docs: add comprehensive README
Some checks failed
Dev Build / deploy-dev (push) Has been cancelled
Dev Build / notify-success (push) Has been cancelled
Dev Build / notify-failure (push) Has been cancelled
Dev Build / build-test (push) Has been cancelled
2026-04-27 18:49:26 -04:00
51bfb6d115 CUB-10: Create IMoonrakerClient interface and DTOs
Some checks failed
Dev Build / build-test (pull_request) Failing after 58s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 6s
- Expanded IMoonrakerClient interface with 6 strongly-typed methods:
  - GetServerInfoAsync (Moonraker /server/info)
  - IsReachableAsync (connectivity check)
  - GetPrinterInfoAsync (Moonraker /printer/info)
  - GetPrintHistoryAsync (Moonraker /server/history/items)
  - GetPrintStatsAsync (Moonraker /printer/objects/query?print_stats)
  - GetDisplayStatusAsync (Moonraker /printer/objects/query?display_status)
  - GetFilamentUsageAsync (retained for backward compatibility)

- Created Domain/DTOs/Moonraker/ with 7 DTOs:
  - MoonrakerServerInfo, MoonrakerPrinterInfo, MoonrakerPrintJob
  - MoonrakerHistoryResponse, MoonrakerPrintStats
  - MoonrakerDisplayStatus, MoonrakerRequest

- Updated MoonrakerClient implementation to support all new methods
  with proper JSON parsing and mapping helpers

- Full XML doc comments on all public members
2026-04-27 18:42:47 -04:00
aa182af979 Merge pull request 'feat(CUB-28): [Extrudex] Define filament inventory database entities' (#24) from agent/hex/CUB-28-filament-inventory-entities into dev
Some checks failed
Dev Build / build-test (push) Failing after 54s
Dev Build / deploy-dev (push) Has been skipped
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 3s
2026-04-27 18:28:26 -04:00
ac033859a8 feat(CUB-28): [Extrudex] Define filament inventory database entities
Some checks failed
Dev Build / build-test (pull_request) Failing after 1m3s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
Add storage_location and is_archived fields to Spool entity to complete
the filament inventory entity definition per CUB-28 requirements.

Changes:
- Spool entity: add IsArchived (bool, default false) and StorageLocation
  (nullable string, max 200) for physical inventory tracking
- SpoolConfiguration: add snake_case column mappings, defaults, and indexes
  (ix_spools_is_archived, ix_spools_active_archived composite)
- FilamentDtos: add IsArchived + StorageLocation to Response, Create, Update
- FilamentQueryDtos: add IncludeArchived and StorageLocation query filters
- FilamentsController: wire new fields into query, create, update, mapping
- FilamentValidators: add StorageLocation max-length validation

Build: PASS (0 errors)
2026-04-27 18:24:52 -04:00
c3a0f210a1 Merge pull request 'CUB-64: Docker Runtime Setup for Development & Deployment' (#14) from agent/dex/CUB-64-docker-runtime-setup into dev
Some checks failed
Dev Build / build-test (push) Failing after 59s
Dev Build / deploy-dev (push) Has been skipped
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 3s
Reviewed-on: #14
2026-04-27 17:34:40 -04:00
2017843dc1 Merge branch 'dev' into agent/dex/CUB-64-docker-runtime-setup
Some checks failed
Dev Build / build-test (pull_request) Failing after 58s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
2026-04-27 17:34:26 -04:00
c150f54c64 Merge pull request 'feat(CUB-39): Create background job for filament usage sync' (#16) from agent/dex/CUB-39-filament-usage-sync into dev
Some checks failed
Dev Build / deploy-dev (push) Has been cancelled
Dev Build / notify-success (push) Has been cancelled
Dev Build / notify-failure (push) Has been cancelled
Dev Build / build-test (push) Has been cancelled
Reviewed-on: #16
2026-04-27 17:33:59 -04:00
73363206ec Merge branch 'dev' into agent/dex/CUB-39-filament-usage-sync
Some checks failed
Dev Build / build-test (pull_request) Failing after 1m3s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 4s
2026-04-27 17:25:58 -04:00
174dd294e9 Merge pull request 'CUB-37: Implement cost-per-print calculation service' (#18) from agent/dex/CUB-37-cost-per-print into dev
Some checks failed
Dev Build / deploy-dev (push) Has been skipped
Dev Build / notify-success (push) Has been skipped
Dev Build / build-test (push) Failing after 1m3s
Dev Build / notify-failure (push) Successful in 3s
Reviewed-on: #18
2026-04-27 17:25:37 -04:00
0378aee43e Merge branch 'dev' into agent/dex/CUB-37-cost-per-print
Some checks failed
Dev Build / build-test (pull_request) Failing after 1m0s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 5s
2026-04-27 17:25:22 -04:00
72a39ec766 Merge pull request 'CUB-34: Add filament filter bar with material type, color, and low stock filters' (#21) from agent/rex/CUB-34-filament-list-ui into dev
Some checks failed
Dev Build / build-test (push) Failing after 51s
Dev Build / deploy-dev (push) Has been skipped
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 3s
Reviewed-on: #21
Reviewed-by: Joshua <joshua@cnjmail.com>
2026-04-27 17:14:55 -04:00
c05b9dd87d merge(dev): Re-apply CUB-34 changes after merge conflict resolution
Some checks failed
Dev Build / build-test (pull_request) Failing after 54s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
2026-04-27 17:02:25 -04:00
5a577e1871 Merge remote-tracking branch 'origin/dev' into fix-pr-21
# Conflicts:
#	frontend/src/app/components/filament-table/filament-table.component.html
#	frontend/src/app/components/filament-table/filament-table.component.ts
2026-04-27 17:02:25 -04:00
2e8227c3f9 Merge pull request 'CUB-36: Add delete confirmation dialog for filament spool removal' (#19) from agent/rex/CUB-36-delete-confirmation into dev
Some checks failed
Dev Build / build-test (push) Failing after 55s
Dev Build / deploy-dev (push) Has been skipped
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 4s
Reviewed-on: #19
2026-04-27 15:24:55 -04:00
d207c49ffd CUB-34: add filament filter bar with material type, color, and low stock filters
Some checks failed
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / build-test (pull_request) Failing after 54s
Dev Build / notify-failure (pull_request) Successful in 6s
2026-04-27 15:08:31 -04:00
5b9dde13fe Merge remote-tracking branch 'origin/dev' into fix-pr-18
Some checks failed
Dev Build / build-test (pull_request) Failing after 54s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 4s
# Conflicts:
#	backend/API/Controllers/PrintJobsController.cs
2026-04-27 14:30:05 -04:00
fd9fcd47ab Merge remote-tracking branch 'origin/dev' into fix-pr-14
Some checks failed
Dev Build / build-test (pull_request) Failing after 58s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
# Conflicts:
#	frontend/.dockerignore
#	frontend/Dockerfile
#	frontend/nginx.conf
2026-04-27 14:30:03 -04:00
cfd4a81b5f Merge branch 'dev' into agent/rex/CUB-36-delete-confirmation
Some checks failed
Dev Build / build-test (pull_request) Failing after 52s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
2026-04-27 14:17:44 -04:00
8a2f97d2cd Merge pull request 'CUB-40: Add cost summary API endpoint' (#15) from agent/dex/CUB-40-cost-summary-api into dev
Some checks failed
Dev Build / build-test (push) Failing after 52s
Dev Build / deploy-dev (push) Has been skipped
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 3s
Reviewed-on: #15
2026-04-27 14:17:30 -04:00
d43985cad9 Merge branch 'dev' into agent/dex/CUB-39-filament-usage-sync
Some checks failed
Dev Build / build-test (pull_request) Failing after 52s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
2026-04-27 14:15:16 -04:00
b43edad5f0 Merge branch 'dev' into agent/dex/CUB-40-cost-summary-api
Some checks failed
Dev Build / build-test (pull_request) Failing after 52s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
2026-04-27 14:14:13 -04:00
f5ca20307e CUB-36: add delete confirmation dialog for filament spool removal
Some checks failed
Dev Build / build-test (pull_request) Failing after 53s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
2026-04-27 18:12:58 +00:00
12888c4f3f Merge pull request 'CUB-66: Frontend Dockerfile (Angular Static Build)' (#12) from agent/rex/CUB-64-frontend-dockerfile into dev
Some checks failed
Dev Build / build-test (push) Failing after 51s
Dev Build / deploy-dev (push) Has been skipped
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 3s
Reviewed-on: #12
Reviewed-by: Otto the Minion <otto@code.cubecraftcreations.com>
2026-04-27 14:11:55 -04:00
1411b68a95 Merge branch 'dev' into agent/rex/CUB-64-frontend-dockerfile
Some checks failed
Dev Build / build-test (pull_request) Failing after 50s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
2026-04-27 14:11:50 -04:00
7daa7d637c Merge pull request 'CUB-31: Add filament usage tracking model' (#10) from agent/hex/CUB-31-filament-usage-tracking-model into dev
Some checks failed
Dev Build / build-test (push) Failing after 50s
Dev Build / deploy-dev (push) Has been skipped
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 3s
Reviewed-on: #10
Reviewed-by: Otto the Minion <otto@code.cubecraftcreations.com>
2026-04-27 14:09:22 -04:00
c88ad43530 Merge branch 'dev' into agent/hex/CUB-31-filament-usage-tracking-model
Some checks failed
Dev Build / build-test (pull_request) Failing after 54s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
2026-04-27 14:09:13 -04:00
6aa31f4be3 CUB-37: implement cost-per-print calculation service
Some checks failed
Dev Build / build-test (pull_request) Failing after 48s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
2026-04-27 17:57:57 +00:00
4ba98966eb feat(CUB-39): create background job for filament usage sync
Some checks failed
Dev Build / build-test (pull_request) Failing after 48s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
2026-04-27 17:23:24 +00:00
c1a115c938 feat(CUB-40): [Extrudex] Add cost summary API endpoint
Some checks failed
Dev Build / build-test (pull_request) Failing after 47s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
2026-04-27 17:09:08 +00:00
61178ebb7b feat(CUB-64): Docker runtime setup for development & deployment
Some checks failed
Dev Build / build-test (pull_request) Failing after 47s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
- Backend Dockerfile: added curl install for health check (not in aspnet base image)
- Frontend Dockerfile: multi-stage Angular build with nginx serving
- Frontend nginx.conf: SPA routing, API proxy, SignalR WebSocket support, health endpoint
- Frontend .dockerignore: excludes node_modules, dist, .angular, etc.
- docker-compose.dev.yml: added PostgreSQL service, fixed frontend context path,
  renamed web service from control-center-web to extrudex-web, added DB env vars,
  proper service dependencies with health checks
- deploy.sh: updated service list to include PostgreSQL port
2026-04-27 08:33:18 +00:00
920042acac fix(CUB-66): Resolve merge conflicts - keep only Docker setup, remove duplicate Angular app files
Some checks failed
Dev Build / build-test (pull_request) Failing after 1m4s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 5s
2026-04-27 02:23:27 +00:00
8168d25bdf Merge pull request 'CUB-41: Add low stock indicators to filament UI' (#5) from agent/rex/CUB-41-low-stock-indicators into dev
Some checks failed
Dev Build / build-test (push) Failing after 57s
Dev Build / deploy-dev (push) Has been skipped
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 3s
Reviewed-on: #5
Reviewed-by: Joshua <joshua@cnjmail.com>
2026-04-26 17:23:40 -04:00
fc4c9cf397 Merge branch 'dev' into agent/rex/CUB-41-low-stock-indicators
Some checks failed
Dev Build / build-test (pull_request) Failing after 1m0s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
2026-04-26 17:23:32 -04:00
d5b5b44dc2 Merge pull request 'CUB-30: Implement PUT /filaments/{id} update endpoint' (#4) from agent/dex/CUB-30-put-filament-endpoint into dev
Some checks failed
Dev Build / deploy-dev (push) Has been cancelled
Dev Build / notify-success (push) Has been cancelled
Dev Build / notify-failure (push) Has been cancelled
Dev Build / build-test (push) Has been cancelled
Reviewed-on: #4
Reviewed-by: Joshua <joshua@cnjmail.com>
2026-04-26 17:23:12 -04:00
0cd8bb1939 Merge branch 'dev' into agent/dex/CUB-30-put-filament-endpoint
Some checks failed
Dev Build / build-test (pull_request) Failing after 1m2s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 3s
2026-04-26 17:22:15 -04:00
cubecraft-agents[bot]
1ee7562e81 CUB-66: scaffold Angular frontend and add Dockerfile with nginx
Some checks failed
Dev Build / build-test (pull_request) Failing after 58s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 4s
- Scaffolded Angular 21 app in frontend/ (standalone, routing, scss)
- Multi-stage Dockerfile: node:22-alpine build → nginx:alpine serve
- nginx.conf with SPA routing fallback, API proxy, gzip, asset caching
- .dockerignore excludes node_modules, dist, .angular, spec files
- docker build → PASS, container serves UI on port 80 (HTTP 200)
- Final image: 92.9MB (nginx:alpine)
2026-04-26 20:10:01 +00:00
311dd2ee7f feat(CUB-31): [Extrudex] Add filament usage tracking model
Some checks failed
Dev Build / build-test (pull_request) Failing after 58s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 4s
2026-04-26 18:35:37 +00:00
7d0369b8e9 chore: add Docker deployment setup and health check wiring
Some checks failed
Dev Build / build-test (push) Failing after 1m1s
Dev Build / deploy-dev (push) Has been skipped
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 4s
- Add multi-stage Dockerfile for backend (SDK build → ASP.NET runtime,
  non-root user, /health HEALTHCHECK)
- Add docker-compose.dev.yml orchestrating extrudex-api + control-center-web
- Add deploy.sh convenience script wrapping docker compose up --build
- Wire ASP.NET health checks: AddHealthChecks().AddNpgSql() + MapHealthChecks("/health")
- Add backend .dockerignore (comprehensive pattern list)
- Exclude frontend/dist, frontend/node_modules, frontend/.angular from git

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-26 17:28:06 +00:00
ff1fb621d7 Merge pull request 'CUB-29: Create filament inventory database migration' (#9) from agent/hex/CUB-29-filament-migration into dev
Some checks failed
Dev Build / build-test (push) Failing after 59s
Dev Build / deploy-dev (push) Has been skipped
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 3s
Reviewed-on: #9
Reviewed-by: Joshua <joshua@cnjmail.com>
2026-04-26 13:01:05 -04:00
c8ac1fa283 Merge branch 'dev' into agent/hex/CUB-29-filament-migration
Some checks failed
Dev Build / build-test (pull_request) Failing after 59s
Dev Build / deploy-dev (pull_request) Has been skipped
Dev Build / notify-success (pull_request) Has been skipped
Dev Build / notify-failure (pull_request) Successful in 4s
2026-04-26 13:00:29 -04:00
Joshua King
4172e21fd1 update dev workflow to use ubuntu-latest for build-test job
Some checks failed
Dev Build / build-test (push) Failing after 3m19s
Dev Build / deploy-dev (push) Has been skipped
Dev Build / notify-success (push) Has been skipped
Dev Build / notify-failure (push) Successful in 3s
Co-authored-by: Copilot <copilot@github.com>
2026-04-26 12:56:06 -04:00
Joshua King
c3def21220 add notifications for Slack on success and failure of dev deployment
Some checks failed
Dev Build / build-test (push) Has been cancelled
Dev Build / deploy-dev (push) Has been cancelled
Dev Build / notify-success (push) Has been cancelled
Dev Build / notify-failure (push) Has been cancelled
2026-04-26 12:52:22 -04:00
Joshua King
458fc9a4e1 add dev workflow for building and deploying backend and frontend
Some checks failed
Dev Build / build-test (push) Has been cancelled
Dev Build / deploy-dev (push) Has been cancelled
2026-04-26 11:51:31 -04:00
cubecraft-agents[bot]
3d67610575 CUB-41: Add low stock indicators to filament UI 2026-04-26 14:27:08 +00:00
cubecraft-agents[bot]
9cd27e213b CUB-30: Implement PUT /filaments/{id} update endpoint
- Add FluentValidation validators for CreateFilamentRequest and UpdateFilamentRequest
  with comprehensive validation rules (required fields, string lengths, hex color format,
  weight constraints including WeightRemainingGrams <= WeightTotalGrams, purchase price range)
- Add FluentValidationFilter action filter that auto-runs FluentValidation validators
  for all API controller actions before execution, returning 400 with structured error details
- Register FluentValidationFilter in DI and add it to MVC controller filters in Program.cs
- PUT endpoint was already implemented in FilamentsController with proper validation,
  404 handling, FK existence checks, serial uniqueness check, and weight constraint check
- This change ensures FluentValidation rules are enforced consistently via the pipeline
2026-04-26 13:26:26 +00:00
cubecraft-agents[bot]
a0cdacc7be CUB-29: Create filament inventory database migration 2026-04-26 13:16:13 +00:00
128 changed files with 17896 additions and 15 deletions

.gitea/workflows/dev.yml (new file, +44 lines)

@@ -0,0 +1,44 @@
name: Dev Build
on:
  pull_request:
    branches: [dev]
  push:
    branches: [dev]
jobs:
  build-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '9.0.x'
      - name: Restore backend
        run: dotnet restore
        working-directory: ./backend
      - name: Build backend
        run: dotnet build --no-restore --configuration Release
        working-directory: ./backend
      - name: Test backend
        run: dotnet test --no-build --configuration Release
        working-directory: ./backend
      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: "24"
      - name: Install frontend deps
        run: npm ci
        working-directory: ./frontend
      - name: Build frontend
        run: npm run build
        working-directory: ./frontend

.gitignore (vendored, 7 lines changed)

@@ -2,4 +2,9 @@ bin/
obj/
*.user
*.suo
.vs/
# Frontend build artifacts
frontend/dist/
frontend/node_modules/
frontend/.angular/

README.md (+220 lines)

@@ -0,0 +1,220 @@
# Extrudex
> Filament inventory and print tracking system for CubeCraft Creations.
Extrudex replaces Spoolman with a fully custom solution built for Joshua's 7-printer fleet. It tracks spool stock, per-print material consumption, and cost-of-goods — with a touch-optimized kiosk interface on a Raspberry Pi 5.
---
## Tech Stack
| Layer | Technology |
|---|---|
| Backend | ASP.NET Core Web API (.NET 8) |
| Database | PostgreSQL (snake_case via EF Core) |
| ORM | Entity Framework Core |
| Real-time | SignalR (`PrinterHub`) |
| Printer integration | Moonraker REST/WebSocket (Elegoo) · MQTTnet + TLS (Bambu Lab) |
| Frontend | Angular 17+, Angular Material |
| Deployment | Docker · Docker Compose |
---
## Project Structure
```
Extrudex/
├── backend/
│ ├── Domain/
│ │ ├── Base/ # BaseEntity, AuditableEntity
│ │ ├── Entities/ # Spool, Printer, PrintJob, FilamentUsage,
│ │ │ # AmsUnit, AmsSlot, MaterialBase,
│ │ │ # MaterialFinish, MaterialModifier
│ │ ├── Enums/ # ConnectionType, DataSource, JobStatus,
│ │ │ # PrinterStatus, PrinterType, QrResourceType
│ │ └── Interfaces/ # ICostPerPrintService, IFilamentUsageSyncService,
│ │ # IMoonrakerClient, IQrCodeService
│ ├── Infrastructure/
│ │ ├── Configuration/ # FilamentUsageSyncOptions
│ │ ├── Data/
│ │ │ ├── Configurations/ # EF Core fluent configs (snake_case)
│ │ │ ├── Migrations/ # EF migrations
│ │ │ ├── Seed/ # SeedData.cs
│ │ │ └── ExtrudexDbContext.cs
│ │ └── Services/ # CostPerPrintService, FilamentUsageSyncService,
│ │ # MoonrakerClient, QrCodeService
│ └── API/
│ ├── Controllers/ # Filaments, Spools, Printers, PrintJobs,
│ │ # MaterialBases, MaterialFinishes,
│ │ # MaterialModifiers, MaterialLookups,
│ │ # CostAnalysis, QR
│ ├── DTOs/ # Request/response shapes per domain
│ ├── Filters/ # FluentValidationFilter
│ ├── Hubs/ # PrinterHub, IPrinterClient
│ ├── Jobs/ # FilamentUsageSyncJob (background)
│ ├── Validators/ # FluentValidation validators
│ ├── Program.cs
│ └── appsettings.json
├── frontend/
│ └── src/app/
│ ├── components/ # DashboardSummary, FilamentFilter, FilamentTable
│ ├── models/ # Filament, Agent model types
│ └── app.routes.ts
├── design/ # UX specs and mockups (kiosk + mobile)
├── docker-compose.dev.yml
├── deploy.sh
└── README.md
```
---
## Domain Model
### Materials (normalized taxonomy)
| Entity | Description |
|---|---|
| `MaterialBase` | The base material type — PLA, PETG, ABS, ASA, TPU, etc. |
| `MaterialFinish` | Required. Surface finish — Basic (default), Matte, Silk, Sparkle, etc. |
| `MaterialModifier` | Optional. Composite fill — Carbon Fiber, Glass Fiber, Wood, etc. |
**Rules:**
- `MaterialFinish` is required — every spool must have one. Default is `"Basic"`.
- `MaterialModifier` is optional — plain PLA has no modifier.
### Consumption calculation
```
grams_used = mm_extruded × filament_cross_section_area × material_density
```
Grams are always derived, never assumed from printer telemetry directly.
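
The rule above can be sketched as follows (a minimal illustration, not the repo's actual service, which lives in the C# backend; the diameter and density values are assumptions for the example):

```typescript
// Hedged sketch of the derived-consumption formula above (illustrative only).
// Grams are computed from extruded length, never taken from printer telemetry.
function gramsUsed(mmExtruded: number, diameterMm: number, densityGPerCm3: number): number {
  // cross-section area in mm^2: pi * r^2
  const areaMm2 = Math.PI * (diameterMm / 2) ** 2;
  // mm^3 -> cm^3 (divide by 1000), then grams via density
  return (mmExtruded * areaMm2 / 1000) * densityGPerCm3;
}

// 1 m of 1.75 mm PLA at 1.24 g/cm^3 comes out near 2.98 g
console.log(gramsUsed(1000, 1.75, 1.24).toFixed(2));
```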
### Printers
| Type | Integration |
|---|---|
| Bambu Lab (×5) | MQTTnet with TLS |
| Elegoo Centauri Carbon | Moonraker REST + WebSocket |
| Elegoo Saturn (resin ×2) | Manual / future |
AMS units and slots are modelled as `AmsUnit` → `AmsSlot[]` → `Spool`.
---
## Key Design Decisions
1. **Spoolman rejected** — Full custom system for data model control and workflow flexibility.
2. **`"Basic"` not `"Standard"`** — Default `MaterialFinish` value is `Basic`.
3. **`MaterialFinish` is required** — No null/optional finish state allowed.
4. **`MaterialModifier` is optional** — Not every spool has a modifier.
5. **Derived consumption** — Grams calculated from mm × density, never assumed.
6. **Push over poll** — SignalR and MQTT preferred over periodic polling.
7. **Snake_case PostgreSQL** — All database identifiers follow this convention via EF Core.
---
## Getting Started
### Prerequisites
- .NET 8 SDK
- Node.js 20+
- Docker + Docker Compose
- PostgreSQL (or use the dev compose stack)
### Backend
```bash
cd backend
# Restore and build
dotnet restore
dotnet build
# Apply migrations
dotnet ef database update
# Run API (dev)
dotnet run --project API
```
API runs at `http://localhost:5000` · Swagger at `http://localhost:5000/swagger`
### Frontend
```bash
cd frontend
npm install
ng serve
```
Frontend runs at `http://localhost:4200`
### Docker (dev stack)
```bash
docker-compose -f docker-compose.dev.yml up
```
---
## Configuration
`backend/appsettings.json` — override in `appsettings.Development.json` or environment variables:
| Key | Default | Description |
|---|---|---|
| `ConnectionStrings:ExtrudexDb` | `Host=localhost;...` | PostgreSQL connection string |
| `FilamentUsageSync:PollingInterval` | `00:05:00` | Sync job interval |
| `FilamentUsageSync:RequestTimeout` | `00:00:30` | Moonraker request timeout |
| `FilamentUsageSync:Enabled` | `true` | Enable/disable background sync |
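
For example, a hypothetical `appsettings.Development.json` override might look like the following (every value here, including the connection string, is illustrative):

```json
{
  "ConnectionStrings": {
    "ExtrudexDb": "Host=localhost;Port=5432;Database=extrudex;Username=dev;Password=dev"
  },
  "FilamentUsageSync": {
    "PollingInterval": "00:01:00",
    "Enabled": true
  }
}
```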
---
## Real-Time Events
SignalR hub endpoint: `/hubs/printer`
Clients receive `PrinterHub` events for live printer status, job progress, and spool consumption updates.
---
## API Overview
| Route prefix | Resource |
|---|---|
| `/api/filaments` | Filament catalog |
| `/api/spools` | Spool inventory |
| `/api/printers` | Printer registry |
| `/api/print-jobs` | Print job tracking |
| `/api/material-bases` | Material base types |
| `/api/material-finishes` | Material finishes |
| `/api/material-modifiers` | Material modifiers |
| `/api/material-lookups` | Combined material lookup |
| `/api/cost-analysis` | Cost-per-print and COGS |
| `/api/qr` | QR code generation |
Full schema available at `/swagger` when running in dev.
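
As a sketch of the arithmetic behind `/api/cost-analysis` (names are illustrative; the backend implements this in C# and returns warnings rather than errors when cost data is missing):

```typescript
// Hedged sketch of cost-per-print: cost per gram from the spool's purchase
// price, scaled by derived grams. Incomplete data yields null, not an error.
function costPerPrint(
  gramsDerived: number,
  purchasePrice: number | null,
  weightTotalGrams: number
): number | null {
  if (purchasePrice === null || weightTotalGrams <= 0) return null;
  const costPerGram = purchasePrice / weightTotalGrams;
  // monetary results are rounded to 4 decimal places
  return Math.round(gramsDerived * costPerGram * 10000) / 10000;
}

// 250 g printed from a $25, 1000 g spool costs $6.25
console.log(costPerPrint(250, 25, 1000));
```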
---
## CI
Gitea Actions pipeline (`.gitea/workflows/dev.yml`) runs on every push and pull request targeting `dev`:
- Backend: `dotnet restore`, `dotnet build`, `dotnet test`
- Frontend: `npm ci` and `npm run build`
---
## Branch & PR Rules
- All feature branches target `dev`, **never `main`**
- Branch naming: `agent/<agent>/CUB-N-short-description`
- PR titles: `CUB-N: short description`
- PRs require Otto review before Joshua merges
---
*Built by CubeCraft Creations · Orchestrated by Otto*

backend/.dockerignore (new file, +27 lines)

@@ -0,0 +1,27 @@
# Build artifacts
bin/
obj/
# IDE / editor
.vs/
.vscode/
*.user
*.suo
.idea/
# Environment & secrets
appsettings.Development.json
.env
.env.*
# Docker
Dockerfile
.dockerignore
# OS
.DS_Store
Thumbs.db
# Misc
*.md
*.log


@@ -0,0 +1,108 @@
using Extrudex.API.DTOs.PrintJobs;
using Extrudex.Domain.Interfaces;
using Microsoft.AspNetCore.Mvc;
namespace Extrudex.API.Controllers;
/// <summary>
/// Controller for cost analysis endpoints. Provides spool-level
/// cost breakdowns and aggregated COGS reporting.
/// </summary>
[ApiController]
[Route("api/cost-analysis")]
public class CostAnalysisController : ControllerBase
{
private readonly ICostPerPrintService _costService;
private readonly ILogger<CostAnalysisController> _logger;
/// <summary>
/// Initializes a new instance of the <see cref="CostAnalysisController"/> class.
/// </summary>
/// <param name="costService">The cost-per-print calculation service.</param>
/// <param name="logger">The logger for diagnostic output.</param>
public CostAnalysisController(
ICostPerPrintService costService,
ILogger<CostAnalysisController> logger)
{
_costService = costService;
_logger = logger;
}
// ── POST /api/cost-analysis/spool ────────────────────────────
/// <summary>
/// Calculates cost breakdowns for all print jobs associated with a specific spool.
/// Returns per-job costs plus an aggregated total. Jobs with missing cost data
/// include warnings and null cost fields — the endpoint never throws for missing data.
/// </summary>
/// <param name="request">The request containing the spool identifier.</param>
/// <returns>A spool-level cost summary with per-job breakdowns.</returns>
/// <response code="200">Returns the spool cost breakdown with per-job details.</response>
/// <response code="404">If the spool has no print jobs.</response>
[HttpPost("spool")]
[ProducesResponseType(typeof(SpoolCostResponse), StatusCodes.Status200OK)]
[ProducesResponseType(StatusCodes.Status404NotFound)]
public async Task<ActionResult<SpoolCostResponse>> CalculateSpoolCost([FromBody] SpoolCostRequest request)
{
_logger.LogDebug("Calculating cost breakdown for spool {SpoolId}", request.SpoolId);
var results = await _costService.CalculateBySpoolAsync(request.SpoolId);
if (results.Count == 0)
{
return NotFound(new { error = $"No print jobs found for spool with ID '{request.SpoolId}'." });
}
// Build the spool-level summary
var firstResult = results[0];
var jobResponses = results.Select(MapCostToResponse).ToList();
// Aggregate total cost and grams — only include jobs that have a valid cost
var calculableJobs = results.Where(r => r.CostPerPrint.HasValue).ToList();
var totalCost = calculableJobs.Count == results.Count
? Math.Round(calculableJobs.Sum(r => r.CostPerPrint!.Value), 4)
: (decimal?)null;
var aggregateWarnings = new List<string>();
if (calculableJobs.Count < results.Count)
{
aggregateWarnings.Add(
$"{results.Count - calculableJobs.Count} of {results.Count} print jobs have missing cost data. " +
"Total cost reflects only jobs with complete data.");
}
var response = new SpoolCostResponse
{
SpoolId = request.SpoolId,
SpoolSerial = firstResult.SpoolSerial,
PurchasePrice = firstResult.PurchasePrice,
WeightTotalGrams = firstResult.WeightTotalGrams,
CostPerGram = firstResult.CostPerGram,
TotalGramsConsumed = results.Sum(r => r.GramsDerived),
TotalCost = totalCost,
JobCount = results.Count,
Jobs = jobResponses,
Warnings = aggregateWarnings
};
return Ok(response);
}
/// <summary>
/// Maps a domain CostPerPrintResult to an API CostPerPrintResponse DTO.
/// </summary>
private static CostPerPrintResponse MapCostToResponse(CostPerPrintResult r) => new()
{
PrintJobId = r.PrintJobId,
PrintName = r.PrintName,
SpoolId = r.SpoolId,
SpoolSerial = r.SpoolSerial,
MmExtruded = r.MmExtruded,
GramsDerived = r.GramsDerived,
PurchasePrice = r.PurchasePrice,
WeightTotalGrams = r.WeightTotalGrams,
CostPerGram = r.CostPerGram,
CostPerPrint = r.CostPerPrint,
Warnings = r.Warnings
};
}


@@ -1,6 +1,7 @@
using Extrudex.API.DTOs;
using Extrudex.API.DTOs.Filaments;
using Extrudex.Domain.Entities;
using Extrudex.Domain.Interfaces;
using Extrudex.Infrastructure.Data;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;
@@ -17,16 +18,22 @@ namespace Extrudex.API.Controllers;
public class FilamentsController : ControllerBase
{
private readonly ExtrudexDbContext _dbContext;
private readonly ILowStockDetector _lowStockDetector;
private readonly ILogger<FilamentsController> _logger;
/// <summary>
/// Initializes a new instance of the <see cref="FilamentsController"/> class.
/// </summary>
/// <param name="dbContext">The database context for data access.</param>
/// <param name="lowStockDetector">The low-stock detection service for filament alerts.</param>
/// <param name="logger">The logger for diagnostic output.</param>
public FilamentsController(ExtrudexDbContext dbContext, ILogger<FilamentsController> logger)
public FilamentsController(
ExtrudexDbContext dbContext,
ILowStockDetector lowStockDetector,
ILogger<FilamentsController> logger)
{
_dbContext = dbContext;
_lowStockDetector = lowStockDetector;
_logger = logger;
}
@@ -40,15 +47,18 @@ public class FilamentsController : ControllerBase
/// <response code="200">Returns the paginated list of filament spools.</response>
[HttpGet]
[ProducesResponseType(typeof(PagedResponse<FilamentResponse>), StatusCodes.Status200OK)]
[ProducesResponseType(StatusCodes.Status400BadRequest)]
public async Task<ActionResult<PagedResponse<FilamentResponse>>> GetFilaments(
[FromQuery] FilamentQueryParameters query)
{
_logger.LogDebug(
"Getting filaments: pageNumber={PageNumber}, pageSize={PageSize}, " +
"materialBaseId={MaterialBaseId}, materialFinishId={MaterialFinishId}, " +
"materialModifierId={MaterialModifierId}, brand={Brand}, isActive={IsActive}",
"materialModifierId={MaterialModifierId}, brand={Brand}, isActive={IsActive}, " +
"includeArchived={IncludeArchived}, storageLocation={StorageLocation}",
query.PageNumber, query.PageSize, query.MaterialBaseId,
query.MaterialFinishId, query.MaterialModifierId, query.Brand, query.IsActive);
query.MaterialFinishId, query.MaterialModifierId, query.Brand, query.IsActive,
query.IncludeArchived, query.StorageLocation);
// Clamp pagination values
var pageNumber = Math.Max(1, query.PageNumber);
@@ -77,13 +87,22 @@ public class FilamentsController : ControllerBase
if (query.IsActive.HasValue)
spoolQuery = spoolQuery.Where(s => s.IsActive == query.IsActive.Value);
// Exclude archived spools by default; include when explicitly requested
if (query.IncludeArchived != true)
spoolQuery = spoolQuery.Where(s => !s.IsArchived);
if (!string.IsNullOrWhiteSpace(query.StorageLocation))
spoolQuery = spoolQuery.Where(s =>
s.StorageLocation != null &&
s.StorageLocation.ToLower().Contains(query.StorageLocation.ToLower()));
var totalCount = await spoolQuery.CountAsync();
var items = await spoolQuery
.OrderByDescending(s => s.CreatedAt)
.Skip((pageNumber - 1) * pageSize)
.Take(pageSize)
.Select(s => MapToFilamentResponse(s))
.Select(s => MapToFilamentResponse(s, _lowStockDetector))
.ToListAsync();
var response = new PagedResponse<FilamentResponse>
@@ -124,7 +143,7 @@ public class FilamentsController : ControllerBase
return NotFound(new { error = $"Filament with ID '{id}' not found." });
}
return Ok(MapToFilamentResponse(spool));
return Ok(MapToFilamentResponse(spool, _lowStockDetector));
}
/// <summary>
@@ -185,7 +204,9 @@ public class FilamentsController : ControllerBase
SpoolSerial = request.SpoolSerial,
PurchasePrice = request.PurchasePrice,
PurchaseDate = request.PurchaseDate,
IsActive = request.IsActive
IsActive = request.IsActive,
IsArchived = request.IsArchived,
StorageLocation = request.StorageLocation
};
_dbContext.Spools.Add(entity);
@@ -197,7 +218,7 @@ public class FilamentsController : ControllerBase
if (entity.MaterialModifierId.HasValue)
await _dbContext.Entry(entity).Reference(s => s.MaterialModifier).LoadAsync();
var response = MapToFilamentResponse(entity);
var response = MapToFilamentResponse(entity, _lowStockDetector);
return CreatedAtAction(nameof(GetFilament), new { id = entity.Id }, response);
}
@@ -267,6 +288,8 @@ public class FilamentsController : ControllerBase
entity.PurchasePrice = request.PurchasePrice;
entity.PurchaseDate = request.PurchaseDate;
entity.IsActive = request.IsActive;
entity.IsArchived = request.IsArchived;
entity.StorageLocation = request.StorageLocation;
await _dbContext.SaveChangesAsync();
@@ -276,7 +299,97 @@ public class FilamentsController : ControllerBase
if (entity.MaterialModifierId.HasValue)
await _dbContext.Entry(entity).Reference(s => s.MaterialModifier).LoadAsync();
return Ok(MapToFilamentResponse(entity));
return Ok(MapToFilamentResponse(entity, _lowStockDetector));
}
/// <summary>
/// Gets only the filament spools that are flagged as low stock.
/// A spool is considered low stock when its remaining weight percentage
/// is at or below the configured threshold.
/// </summary>
/// <returns>A list of low-stock filament spools with alert metadata.</returns>
/// <response code="200">Returns the list of low-stock filament spools.</response>
[HttpGet("low-stock")]
[ProducesResponseType(typeof(List<FilamentResponse>), StatusCodes.Status200OK)]
public async Task<ActionResult<List<FilamentResponse>>> GetLowStockFilaments()
{
_logger.LogDebug("Getting low-stock filaments (threshold: {Threshold}%)",
_lowStockDetector.LowStockThresholdPercent);
var spools = await _dbContext.Spools
.Include(s => s.MaterialBase)
.Include(s => s.MaterialFinish)
.Include(s => s.MaterialModifier)
.Where(s => s.IsActive)
.OrderByDescending(s => s.CreatedAt)
.ToListAsync();
var lowStockItems = spools
.Where(s => _lowStockDetector.IsLowStock(s.WeightRemainingGrams, s.WeightTotalGrams))
.Select(s => MapToFilamentResponse(s, _lowStockDetector))
.ToList();
return Ok(lowStockItems);
}
/// <summary>
/// Deletes a filament spool by its unique identifier.
/// If the spool has associated print jobs, the deletion is rejected with a 409 Conflict
/// to preserve COGS and print history — the caller should archive the spool instead.
/// Associated filament usage records are removed before the spool is deleted.
/// AMS slots referencing this spool will have their SpoolId set to null by the database.
/// </summary>
/// <param name="id">The unique identifier of the filament spool to delete.</param>
/// <returns>No content on successful deletion.</returns>
/// <response code="204">The filament spool was successfully deleted.</response>
/// <response code="404">If the filament spool with the given ID is not found.</response>
/// <response code="409">If the spool has associated print jobs and cannot be deleted.</response>
[HttpDelete("{id:guid}")]
[ProducesResponseType(StatusCodes.Status204NoContent)]
[ProducesResponseType(StatusCodes.Status404NotFound)]
[ProducesResponseType(StatusCodes.Status409Conflict)]
public async Task<IActionResult> DeleteFilament(Guid id)
{
_logger.LogInformation("Deleting filament {Id}", id);
var entity = await _dbContext.Spools.FindAsync(id);
if (entity is null)
{
_logger.LogWarning("Filament {Id} not found for deletion", id);
return NotFound(new { error = $"Filament with ID '{id}' not found." });
}
// Check for associated print jobs — these cannot be orphaned
var hasPrintJobs = await _dbContext.PrintJobs.AnyAsync(pj => pj.SpoolId == id);
if (hasPrintJobs)
{
_logger.LogWarning(
"Cannot delete filament {Id}: associated print jobs exist. Suggest archiving instead.", id);
return Conflict(new
{
error = $"Cannot delete filament '{id}' because it has associated print jobs. " +
"Archive the filament instead to preserve print history and COGS data."
});
}
// Remove associated filament usage records (usage tracking data for this spool)
var usageRecords = await _dbContext.FilamentUsages
.Where(fu => fu.SpoolId == id)
.ToListAsync();
if (usageRecords.Count > 0)
{
_logger.LogInformation(
"Removing {Count} filament usage records for spool {Id}",
usageRecords.Count, id);
_dbContext.FilamentUsages.RemoveRange(usageRecords);
}
_dbContext.Spools.Remove(entity);
await _dbContext.SaveChangesAsync();
_logger.LogInformation("Filament {Id} deleted successfully", id);
return NoContent();
}
// ── Mapping helper ─────────────────────────────────────────
@@ -285,10 +398,12 @@ public class FilamentsController : ControllerBase
/// Maps a Spool domain entity to a FilamentResponse DTO.
/// Denormalizes material names for display convenience.
/// Populates the QrCodeUrl for easy frontend access to the spool's QR code.
/// Calculates low-stock status and remaining weight percentage.
/// </summary>
/// <param name="s">The spool entity to map.</param>
/// <returns>A FilamentResponse DTO with denormalized material names and QR code URL.</returns>
private static FilamentResponse MapToFilamentResponse(Spool s) => new()
/// <param name="lowStockDetector">The low-stock detection service for computing alert flags.</param>
/// <returns>A FilamentResponse DTO with denormalized material names, QR code URL, and low-stock metadata.</returns>
private static FilamentResponse MapToFilamentResponse(Spool s, ILowStockDetector lowStockDetector) => new()
{
Id = s.Id,
MaterialBaseId = s.MaterialBaseId,
@@ -307,8 +422,12 @@ public class FilamentsController : ControllerBase
PurchasePrice = s.PurchasePrice,
PurchaseDate = s.PurchaseDate,
IsActive = s.IsActive,
IsArchived = s.IsArchived,
StorageLocation = s.StorageLocation,
CreatedAt = s.CreatedAt,
UpdatedAt = s.UpdatedAt,
QrCodeUrl = $"/api/qr/spool/{s.Id}"
QrCodeUrl = $"/api/qr/spool/{s.Id}",
IsLowStock = lowStockDetector.IsLowStock(s.WeightRemainingGrams, s.WeightTotalGrams),
RemainingWeightPercent = lowStockDetector.GetRemainingWeightPercent(s.WeightRemainingGrams, s.WeightTotalGrams)
};
}


@@ -413,6 +413,92 @@ public class PrintJobsController : ControllerBase
return NoContent();
}
// ── GET /api/printjobs/{id}/cost-summary ──────────────────────────
/// <summary>
/// Gets the material cost summary for a specific print job.
/// Calculates total material cost from filament usage (grams derived)
/// and the spool's purchase price. Returns warnings instead of errors
/// when cost data is unavailable.
/// </summary>
/// <param name="id">The unique identifier of the print job.</param>
/// <returns>A cost summary with breakdown and any warnings about missing data.</returns>
/// <response code="200">Returns the cost summary. Warnings field lists any missing data.</response>
/// <response code="404">If the print job with the given ID is not found.</response>
[HttpGet("{id:guid}/cost-summary")]
[ProducesResponseType(typeof(CostSummaryResponse), StatusCodes.Status200OK)]
[ProducesResponseType(StatusCodes.Status404NotFound)]
public async Task<ActionResult<CostSummaryResponse>> GetCostSummary(Guid id)
{
_logger.LogDebug("Getting cost summary for print job {Id}", id);
var job = await _dbContext.PrintJobs
.Include(j => j.Spool)
.ThenInclude(s => s!.MaterialBase)
.FirstOrDefaultAsync(j => j.Id == id);
if (job is null)
{
_logger.LogWarning("Print job {Id} not found for cost summary", id);
return NotFound(new { error = $"Print job with ID '{id}' not found." });
}
var warnings = new List<string>();
var spool = job.Spool;
// Build response with what we have
var response = new CostSummaryResponse
{
PrintJobId = job.Id,
PrintName = job.PrintName,
SpoolId = job.SpoolId,
SpoolSerial = spool?.SpoolSerial ?? string.Empty,
SpoolBrand = spool?.Brand ?? string.Empty,
SpoolColorName = spool?.ColorName ?? string.Empty,
MmExtruded = job.MmExtruded,
GramsDerived = job.GramsDerived,
SpoolPurchasePrice = spool?.PurchasePrice,
SpoolWeightTotalGrams = spool?.WeightTotalGrams,
StoredCostPerPrint = job.CostPerPrint
};
// Validate spool data availability
if (spool is null)
{
warnings.Add("Spool data is not available for this print job. Cost cannot be calculated.");
response.Warnings = warnings;
return Ok(response);
}
// Check if we can calculate cost
if (!spool.PurchasePrice.HasValue)
{
warnings.Add("Spool purchase price is not set. Cost per gram and total material cost cannot be calculated.");
}
if (spool.WeightTotalGrams <= 0)
{
warnings.Add("Spool total weight is zero or invalid. Cost per gram and total material cost cannot be calculated.");
}
// If we have enough data, calculate the cost
if (spool.PurchasePrice.HasValue && spool.WeightTotalGrams > 0)
{
var pricePerGram = spool.PurchasePrice.Value / spool.WeightTotalGrams;
response.PricePerGram = Math.Round(pricePerGram, 4);
response.TotalMaterialCost = Math.Round(job.GramsDerived * pricePerGram, 4);
}
// Warn if grams derived is zero but mm extruded is non-zero
if (job.GramsDerived == 0 && job.MmExtruded > 0)
{
warnings.Add("GramsDerived is zero despite MmExtruded being non-zero. Cost may be inaccurate. Consider re-deriving grams from filament parameters.");
}
response.Warnings = warnings;
return Ok(response);
}
// ── Gram Derivation Formula ────────────────────────────────────
/// <summary>


@@ -0,0 +1,117 @@
using Extrudex.API.DTOs.UsageLogs;
using Extrudex.Domain.Enums;
using Extrudex.Domain.Interfaces;
using Microsoft.AspNetCore.Mvc;
namespace Extrudex.API.Controllers;
/// <summary>
/// API controller for recording and querying filament usage logs.
/// Usage logs provide a fine-grained audit trail of filament consumption
/// from printer integrations or manual input.
/// </summary>
[ApiController]
[Route("api/[controller]")]
[Produces("application/json")]
public class UsageLogsController : ControllerBase
{
private readonly IUsageLogService _usageLogService;
/// <summary>
/// Initializes a new instance of the <see cref="UsageLogsController"/> class.
/// </summary>
/// <param name="usageLogService">The usage log service for recording and querying usage.</param>
public UsageLogsController(IUsageLogService usageLogService)
{
_usageLogService = usageLogService;
}
/// <summary>
/// Records a new filament usage entry.
/// </summary>
/// <param name="request">The usage entry details.</param>
/// <returns>The created usage log entry.</returns>
[HttpPost]
[ProducesResponseType(typeof(UsageLogResponse), StatusCodes.Status201Created)]
[ProducesResponseType(StatusCodes.Status400BadRequest)]
public async Task<ActionResult<UsageLogResponse>> Create([FromBody] CreateUsageLogRequest request)
{
if (!Enum.TryParse<DataSource>(request.DataSource, ignoreCase: true, out var dataSource))
{
return BadRequest($"Invalid data source: '{request.DataSource}'. Valid values: Mqtt, Moonraker, Manual.");
}
var entry = await _usageLogService.RecordUsageAsync(
spoolId: request.SpoolId,
gramsUsed: request.GramsUsed,
dataSource: dataSource,
printerId: request.PrinterId,
printJobId: request.PrintJobId,
mmExtruded: request.MmExtruded,
usageTimestamp: request.UsageTimestamp,
notes: request.Notes
);
return CreatedAtAction(
nameof(GetBySpool),
new { spoolId = entry.SpoolId },
MapToResponse(entry));
}
/// <summary>
/// Gets usage logs for a specific spool, ordered by most recent first.
/// </summary>
/// <param name="spoolId">The spool ID to filter by.</param>
/// <returns>A collection of usage log entries for the spool.</returns>
[HttpGet("spool/{spoolId:guid}")]
[ProducesResponseType(typeof(IEnumerable<UsageLogResponse>), StatusCodes.Status200OK)]
public async Task<ActionResult<IEnumerable<UsageLogResponse>>> GetBySpool(Guid spoolId)
{
var logs = await _usageLogService.GetBySpoolAsync(spoolId);
return Ok(logs.Select(MapToResponse));
}
/// <summary>
/// Gets usage logs for a specific printer, ordered by most recent first.
/// </summary>
/// <param name="printerId">The printer ID to filter by.</param>
/// <returns>A collection of usage log entries for the printer.</returns>
[HttpGet("printer/{printerId:guid}")]
[ProducesResponseType(typeof(IEnumerable<UsageLogResponse>), StatusCodes.Status200OK)]
public async Task<ActionResult<IEnumerable<UsageLogResponse>>> GetByPrinter(Guid printerId)
{
var logs = await _usageLogService.GetByPrinterAsync(printerId);
return Ok(logs.Select(MapToResponse));
}
/// <summary>
/// Gets usage logs for a specific print job, ordered by most recent first.
/// </summary>
/// <param name="printJobId">The print job ID to filter by.</param>
/// <returns>A collection of usage log entries for the print job.</returns>
[HttpGet("print-job/{printJobId:guid}")]
[ProducesResponseType(typeof(IEnumerable<UsageLogResponse>), StatusCodes.Status200OK)]
public async Task<ActionResult<IEnumerable<UsageLogResponse>>> GetByPrintJob(Guid printJobId)
{
var logs = await _usageLogService.GetByPrintJobAsync(printJobId);
return Ok(logs.Select(MapToResponse));
}
/// <summary>
/// Maps a UsageLog domain entity to a UsageLogResponse DTO.
/// </summary>
private static UsageLogResponse MapToResponse(Domain.Entities.UsageLog log) => new()
{
Id = log.Id,
SpoolId = log.SpoolId,
PrinterId = log.PrinterId,
PrintJobId = log.PrintJobId,
GramsUsed = log.GramsUsed,
MmExtruded = log.MmExtruded,
UsageTimestamp = log.UsageTimestamp,
DataSource = log.DataSource.ToString(),
Notes = log.Notes,
CreatedAt = log.CreatedAt,
UpdatedAt = log.UpdatedAt
};
}


@@ -59,6 +59,12 @@ public class FilamentResponse
/// <summary>Whether the spool is currently active and available.</summary>
public bool IsActive { get; set; }
/// <summary>Whether the spool has been archived (removed from active inventory).</summary>
public bool IsArchived { get; set; }
/// <summary>Physical storage location (e.g., "Shelf A", "Drawer 3"). Null if unset.</summary>
public string? StorageLocation { get; set; }
/// <summary>Timestamp when this record was created (UTC).</summary>
public DateTime CreatedAt { get; set; }
@@ -70,6 +76,19 @@ public class FilamentResponse
/// Encodes a deep link to the spool's detail page.
/// </summary>
public string QrCodeUrl { get; set; } = string.Empty;
/// <summary>
/// Whether this spool is flagged as low stock — remaining weight is at or
/// below the configured low-stock threshold percentage.
/// Useful for UI alerts and inventory dashboards.
/// </summary>
public bool IsLowStock { get; set; }
/// <summary>
/// Remaining filament weight as a percentage of total weight (0–100).
/// Rounded to one decimal place. Returns 0 if total weight is zero.
/// </summary>
public decimal RemainingWeightPercent { get; set; }
}
/// <summary>
@@ -133,6 +152,15 @@ public class CreateFilamentRequest
/// <summary>Whether the spool is active. Defaults to true.</summary>
public bool IsActive { get; set; } = true;
/// <summary>Whether the spool is archived. Defaults to false.</summary>
public bool IsArchived { get; set; } = false;
/// <summary>Physical storage location (e.g., "Shelf A", "Drawer 3"). Optional.</summary>
[StringLength(200, ErrorMessage = "StorageLocation must not exceed 200 characters.")]
public string? StorageLocation { get; set; }
}
/// <summary>
@@ -196,4 +224,11 @@ public class UpdateFilamentRequest
/// <summary>Whether the spool is active.</summary>
public bool IsActive { get; set; } = true;
/// <summary>Whether the spool is archived. Defaults to false.</summary>
public bool IsArchived { get; set; } = false;
/// <summary>Physical storage location (e.g., "Shelf A", "Drawer 3"). Optional.</summary>
[StringLength(200, ErrorMessage = "StorageLocation must not exceed 200 characters.")]
public string? StorageLocation { get; set; }
}


@@ -30,4 +30,11 @@ public class FilamentQueryParameters
/// <summary>Optional filter by active status. True = active only, False = inactive only.</summary>
public bool? IsActive { get; set; }
/// <summary>Whether to include archived spools in results. Defaults to false (excludes archived).</summary>
public bool? IncludeArchived { get; set; }
/// <summary>Optional filter by storage location (case-insensitive partial match).</summary>
public string? StorageLocation { get; set; }
}


@@ -0,0 +1,99 @@
using System.ComponentModel.DataAnnotations;
namespace Extrudex.API.DTOs.PrintJobs;
/// <summary>
/// Response DTO for cost-per-print calculation. Contains the full cost
/// breakdown and any warnings about missing or incomplete data.
/// </summary>
public class CostPerPrintResponse
{
/// <summary>The print job identifier this result belongs to.</summary>
public Guid PrintJobId { get; set; }
/// <summary>Human-readable name of the print job.</summary>
public string PrintName { get; set; } = string.Empty;
/// <summary>The spool identifier that provided filament.</summary>
public Guid SpoolId { get; set; }
/// <summary>Serial number of the spool.</summary>
public string SpoolSerial { get; set; } = string.Empty;
/// <summary>Total millimeters of filament extruded.</summary>
public decimal MmExtruded { get; set; }
/// <summary>Derived grams consumed for this print.</summary>
public decimal GramsDerived { get; set; }
/// <summary>The spool's purchase price. Null if not recorded.</summary>
public decimal? PurchasePrice { get; set; }
/// <summary>The spool's total weight in grams when full.</summary>
public decimal? WeightTotalGrams { get; set; }
/// <summary>Cost per gram of filament. Null if purchase price or total weight is missing.</summary>
public decimal? CostPerGram { get; set; }
/// <summary>Calculated cost of this print job. Null if cost data is incomplete.</summary>
public decimal? CostPerPrint { get; set; }
/// <summary>
/// Warnings about missing or incomplete data. Empty when all data is available
/// and the calculation succeeded.
/// </summary>
public List<string> Warnings { get; set; } = new();
}
/// <summary>
/// Request DTO for batch cost calculation by spool. Returns cost breakdowns
/// for all print jobs associated with the specified spool.
/// </summary>
public class SpoolCostRequest
{
/// <summary>The unique identifier of the spool to calculate costs for.</summary>
[Required(ErrorMessage = "SpoolId is required.")]
public Guid SpoolId { get; set; }
}
/// <summary>
/// Response DTO for spool-level cost calculation. Contains cost breakdowns
/// for all print jobs on the spool, plus a total cost summary.
/// </summary>
public class SpoolCostResponse
{
/// <summary>The spool identifier.</summary>
public Guid SpoolId { get; set; }
/// <summary>Serial number of the spool.</summary>
public string SpoolSerial { get; set; } = string.Empty;
/// <summary>The spool's purchase price. Null if not recorded.</summary>
public decimal? PurchasePrice { get; set; }
/// <summary>The spool's total weight in grams when full.</summary>
public decimal? WeightTotalGrams { get; set; }
/// <summary>Cost per gram of filament. Null if cost data is incomplete.</summary>
public decimal? CostPerGram { get; set; }
/// <summary>Total grams consumed across all print jobs on this spool.</summary>
public decimal TotalGramsConsumed { get; set; }
/// <summary>Total calculated cost across all print jobs. Null if any job has missing data.</summary>
public decimal? TotalCost { get; set; }
/// <summary>Number of print jobs included in this calculation.</summary>
public int JobCount { get; set; }
/// <summary>
/// Individual cost breakdowns per print job. Jobs with missing data
/// will have null cost fields and populated warnings.
/// </summary>
public List<CostPerPrintResponse> Jobs { get; set; } = new();
/// <summary>
/// Aggregate warnings about missing data across all jobs.
/// </summary>
public List<string> Warnings { get; set; } = new();
}


@@ -0,0 +1,55 @@
namespace Extrudex.API.DTOs.PrintJobs;
/// <summary>
/// Response DTO for the cost summary of a print job.
/// Provides a breakdown of material cost based on filament usage
/// and spool pricing data. If cost data is incomplete, warnings
/// are returned instead of throwing an error.
/// </summary>
public class CostSummaryResponse
{
/// <summary>Unique identifier of the print job.</summary>
public Guid PrintJobId { get; set; }
/// <summary>Human-readable name of the print job.</summary>
public string PrintName { get; set; } = string.Empty;
/// <summary>Foreign key to the spool used for this print job.</summary>
public Guid SpoolId { get; set; }
/// <summary>Serial number of the spool.</summary>
public string SpoolSerial { get; set; } = string.Empty;
/// <summary>Brand of the spool.</summary>
public string SpoolBrand { get; set; } = string.Empty;
/// <summary>Color name of the spool.</summary>
public string SpoolColorName { get; set; } = string.Empty;
/// <summary>Total millimeters of filament extruded during this print.</summary>
public decimal MmExtruded { get; set; }
/// <summary>Derived grams consumed for this print job.</summary>
public decimal GramsDerived { get; set; }
/// <summary>Purchase price of the full spool, if available.</summary>
public decimal? SpoolPurchasePrice { get; set; }
/// <summary>Total weight of the spool in grams when full.</summary>
public decimal? SpoolWeightTotalGrams { get; set; }
/// <summary>Calculated price per gram (purchase price / total weight), if available.</summary>
public decimal? PricePerGram { get; set; }
/// <summary>Calculated total material cost for this print job, if available.</summary>
public decimal? TotalMaterialCost { get; set; }
/// <summary>The CostPerPrint stored on the print job entity, if set.</summary>
public decimal? StoredCostPerPrint { get; set; }
/// <summary>
/// Warnings about missing data that prevent cost calculation.
/// Empty if all data is available and cost was calculated successfully.
/// </summary>
public List<string> Warnings { get; set; } = new();
}


@@ -0,0 +1,115 @@
using System.ComponentModel.DataAnnotations;
namespace Extrudex.API.DTOs.UsageLogs;
/// <summary>
/// Request DTO for recording a filament usage entry.
/// </summary>
public class CreateUsageLogRequest
{
/// <summary>
/// The ID of the spool that provided the filament.
/// </summary>
[Required]
public Guid SpoolId { get; set; }
/// <summary>
/// The number of grams of filament consumed.
/// </summary>
[Required]
[Range(0.01, double.MaxValue, ErrorMessage = "GramsUsed must be a positive value.")]
public decimal GramsUsed { get; set; }
/// <summary>
/// The source of the usage data (Mqtt, Moonraker, Manual).
/// </summary>
[Required]
public string DataSource { get; set; } = string.Empty;
/// <summary>
/// The ID of the printer that consumed the filament. Optional.
/// </summary>
public Guid? PrinterId { get; set; }
/// <summary>
/// The ID of the print job associated with this usage. Optional.
/// </summary>
public Guid? PrintJobId { get; set; }
/// <summary>
/// The number of millimeters of filament extruded. Optional.
/// </summary>
public decimal? MmExtruded { get; set; }
/// <summary>
/// When the usage occurred (UTC). Defaults to now if not specified.
/// </summary>
public DateTime? UsageTimestamp { get; set; }
/// <summary>
/// Optional notes about this usage entry.
/// </summary>
[MaxLength(2000)]
public string? Notes { get; set; }
}
/// <summary>
/// Response DTO for a usage log entry.
/// </summary>
public class UsageLogResponse
{
/// <summary>
/// Unique identifier for the usage log entry.
/// </summary>
public Guid Id { get; set; }
/// <summary>
/// The spool that provided the filament.
/// </summary>
public Guid SpoolId { get; set; }
/// <summary>
/// The printer that consumed the filament, if applicable.
/// </summary>
public Guid? PrinterId { get; set; }
/// <summary>
/// The print job associated with this usage, if applicable.
/// </summary>
public Guid? PrintJobId { get; set; }
/// <summary>
/// Grams of filament consumed.
/// </summary>
public decimal GramsUsed { get; set; }
/// <summary>
/// Millimeters of filament extruded, if available.
/// </summary>
public decimal? MmExtruded { get; set; }
/// <summary>
/// When the usage occurred (UTC).
/// </summary>
public DateTime UsageTimestamp { get; set; }
/// <summary>
/// Source of the usage data (Mqtt, Moonraker, Manual).
/// </summary>
public string DataSource { get; set; } = string.Empty;
/// <summary>
/// Optional notes about this usage entry.
/// </summary>
public string? Notes { get; set; }
/// <summary>
/// When the record was created (UTC).
/// </summary>
public DateTime CreatedAt { get; set; }
/// <summary>
/// When the record was last updated (UTC).
/// </summary>
public DateTime UpdatedAt { get; set; }
}


@@ -0,0 +1,69 @@
using FluentValidation;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Filters;
namespace Extrudex.API.Filters;
/// <summary>
/// Action filter that automatically validates request DTOs using FluentValidation
/// validators registered in DI. Runs before the controller action executes.
/// Returns 400 Bad Request with validation errors if validation fails.
/// </summary>
public class FluentValidationFilter : IAsyncActionFilter
{
private readonly IServiceProvider _serviceProvider;
private readonly ILogger<FluentValidationFilter> _logger;
public FluentValidationFilter(IServiceProvider serviceProvider, ILogger<FluentValidationFilter> logger)
{
_serviceProvider = serviceProvider;
_logger = logger;
}
public async Task OnActionExecutionAsync(ActionExecutingContext context, ActionExecutionDelegate next)
{
foreach (var argument in context.ActionArguments.Values)
{
if (argument is null) continue;
var argumentType = argument.GetType();
var validatorType = typeof(IValidator<>).MakeGenericType(argumentType);
// Try to resolve a validator for this argument type
var validator = _serviceProvider.GetService(validatorType) as IValidator;
if (validator is null) continue;
_logger.LogDebug("Validating {Type} with {Validator}", argumentType.Name, validator.GetType().Name);
var validationResult = await validator.ValidateAsync(
new ValidationContext<object>(argument), context.HttpContext.RequestAborted);
if (!validationResult.IsValid)
{
foreach (var error in validationResult.Errors)
{
context.ModelState.AddModelError(error.PropertyName, error.ErrorMessage);
}
}
}
if (!context.ModelState.IsValid)
{
var errors = context.ModelState
.Where(kvp => kvp.Value?.Errors.Count > 0)
.ToDictionary(
kvp => kvp.Key,
kvp => kvp.Value!.Errors.Select(e => e.ErrorMessage).ToArray());
context.Result = new BadRequestObjectResult(new
{
title = "Validation failed",
status = 400,
errors
});
return;
}
await next();
}
}


@@ -0,0 +1,79 @@
using Extrudex.Domain.Interfaces;
using Extrudex.Infrastructure.Configuration;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
namespace Extrudex.API.Jobs;
/// <summary>
/// Background job that periodically syncs filament usage data from
/// Moonraker printers. Runs as a hosted service and polls all active
/// Moonraker printers on a configurable interval to persist usage
/// data to the Extrudex database.
///
/// Configuration is bound from the "FilamentUsageSync" section in
/// appsettings.json. Set Enabled=false to disable without removing
/// the service registration.
/// </summary>
public class FilamentUsageSyncJob : BackgroundService
{
private readonly IFilamentUsageSyncService _syncService;
private readonly FilamentUsageSyncOptions _options;
private readonly ILogger<FilamentUsageSyncJob> _logger;
/// <summary>
/// Creates a new FilamentUsageSyncJob.
/// </summary>
/// <param name="syncService">The service that performs the actual sync logic.</param>
/// <param name="options">Configuration options for polling interval and timeouts.</param>
/// <param name="logger">Logger for diagnostic output.</param>
public FilamentUsageSyncJob(
IFilamentUsageSyncService syncService,
IOptions<FilamentUsageSyncOptions> options,
ILogger<FilamentUsageSyncJob> logger)
{
_syncService = syncService;
_options = options.Value;
_logger = logger;
}
/// <inheritdoc />
protected override async Task ExecuteAsync(CancellationToken stoppingToken)
{
if (!_options.Enabled)
{
_logger.LogInformation("Filament usage sync job is disabled via configuration — exiting");
return;
}
_logger.LogInformation(
"Filament usage sync job starting — polling every {Interval}",
_options.PollingInterval);
// Delay briefly on startup to allow the web host to fully initialize
await Task.Delay(TimeSpan.FromSeconds(10), stoppingToken);
while (!stoppingToken.IsCancellationRequested)
{
try
{
var syncedCount = await _syncService.SyncAllAsync(stoppingToken);
_logger.LogInformation(
"Filament usage sync completed — {SyncedCount} printer(s) synced. Next sync in {Interval}",
syncedCount, _options.PollingInterval);
}
catch (Exception ex) when (ex is not OperationCanceledException)
{
_logger.LogError(ex,
"Error during filament usage sync cycle — will retry in {Interval}",
_options.PollingInterval);
}
await Task.Delay(_options.PollingInterval, stoppingToken);
}
_logger.LogInformation("Filament usage sync job shutting down");
}
}


@@ -0,0 +1,80 @@
using Extrudex.Domain.Interfaces;
using Extrudex.Infrastructure.Configuration;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
namespace Extrudex.API.Jobs;
/// <summary>
/// Background service that periodically syncs Moonraker printer status
/// and print job history into the Extrudex database. Runs as a hosted
/// service and polls all active Moonraker printers on a configurable
/// interval to update printer state and map completed print jobs
/// to PrintJob and FilamentUsage entities.
///
/// Configuration is bound from the "MoonrakerPrinterSync" section in
/// appsettings.json. Set Enabled=false to disable without removing
/// the service registration.
/// </summary>
public class MoonrakerPrinterSyncJob : BackgroundService
{
private readonly IMoonrakerPrinterSyncService _syncService;
private readonly MoonrakerPrinterSyncOptions _options;
private readonly ILogger<MoonrakerPrinterSyncJob> _logger;
/// <summary>
/// Creates a new MoonrakerPrinterSyncJob.
/// </summary>
/// <param name="syncService">The service that performs the actual sync logic.</param>
/// <param name="options">Configuration options for polling interval and timeouts.</param>
/// <param name="logger">Logger for diagnostic output.</param>
public MoonrakerPrinterSyncJob(
IMoonrakerPrinterSyncService syncService,
IOptions<MoonrakerPrinterSyncOptions> options,
ILogger<MoonrakerPrinterSyncJob> logger)
{
_syncService = syncService;
_options = options.Value;
_logger = logger;
}
/// <inheritdoc />
protected override async Task ExecuteAsync(CancellationToken stoppingToken)
{
if (!_options.Enabled)
{
_logger.LogInformation("Moonraker printer sync job is disabled via configuration — exiting");
return;
}
_logger.LogInformation(
"Moonraker printer sync job starting — polling every {Interval}",
_options.PollingInterval);
// Delay briefly on startup to allow the web host to fully initialize
await Task.Delay(TimeSpan.FromSeconds(15), stoppingToken);
while (!stoppingToken.IsCancellationRequested)
{
try
{
var syncedCount = await _syncService.SyncAllAsync(stoppingToken);
_logger.LogInformation(
"Moonraker printer sync completed — {SyncedCount} printer(s) synced. Next sync in {Interval}",
syncedCount, _options.PollingInterval);
}
catch (Exception ex) when (ex is not OperationCanceledException)
{
_logger.LogError(ex,
"Error during Moonraker printer sync cycle — will retry in {Interval}",
_options.PollingInterval);
}
await Task.Delay(_options.PollingInterval, stoppingToken);
}
_logger.LogInformation("Moonraker printer sync job shutting down");
}
}


@@ -0,0 +1,126 @@
using Extrudex.API.DTOs.Filaments;
using FluentValidation;
namespace Extrudex.API.Validators;
/// <summary>
/// Validation rules for creating a Filament (Spool) via the /filaments route.
/// Mirrors the domain rules enforced in the controller and ensures consistent
/// validation regardless of the request pipeline entry point.
/// </summary>
public class CreateFilamentRequestValidator : AbstractValidator<CreateFilamentRequest>
{
/// <summary>
/// Initializes validation rules for <see cref="CreateFilamentRequest"/>.
/// </summary>
public CreateFilamentRequestValidator()
{
RuleFor(x => x.MaterialBaseId)
.NotEmpty().WithMessage("MaterialBaseId is required.");
RuleFor(x => x.MaterialFinishId)
.NotEmpty().WithMessage("MaterialFinishId is required.");
RuleFor(x => x.Brand)
.NotEmpty().WithMessage("Brand is required.")
.MaximumLength(200).WithMessage("Brand must not exceed 200 characters.");
RuleFor(x => x.ColorName)
.NotEmpty().WithMessage("ColorName is required.")
.MaximumLength(200).WithMessage("ColorName must not exceed 200 characters.");
RuleFor(x => x.ColorHex)
.NotEmpty().WithMessage("ColorHex is required.")
.Matches(@"^#[0-9A-Fa-f]{6}$").WithMessage("ColorHex must be a valid hex color code (e.g., #FF0000).");
RuleFor(x => x.WeightTotalGrams)
.GreaterThan(0).WithMessage("Total weight must be greater than zero.");
RuleFor(x => x.WeightRemainingGrams)
.GreaterThanOrEqualTo(0).WithMessage("Remaining weight must be non-negative.");
RuleFor(x => x.WeightRemainingGrams)
.LessThanOrEqualTo(x => x.WeightTotalGrams)
.WithMessage("WeightRemainingGrams cannot exceed WeightTotalGrams.");
RuleFor(x => x.FilamentDiameterMm)
.GreaterThan(0).WithMessage("Filament diameter must be greater than zero.");
RuleFor(x => x.SpoolSerial)
.NotEmpty().WithMessage("SpoolSerial is required.")
.MaximumLength(200).WithMessage("SpoolSerial must not exceed 200 characters.");
When(x => x.PurchasePrice.HasValue, () =>
{
RuleFor(x => x.PurchasePrice!.Value)
.GreaterThanOrEqualTo(0).WithMessage("Purchase price must be non-negative.");
});
When(x => x.StorageLocation != null, () =>
{
RuleFor(x => x.StorageLocation!)
.MaximumLength(200).WithMessage("StorageLocation must not exceed 200 characters.");
});
}
}
/// <summary>
/// Validation rules for updating a Filament (Spool) via the /filaments route.
/// Enforces the same domain rules as creation, plus ensures the updated
/// WeightRemainingGrams does not exceed the updated WeightTotalGrams.
/// </summary>
public class UpdateFilamentRequestValidator : AbstractValidator<UpdateFilamentRequest>
{
/// <summary>
/// Initializes validation rules for <see cref="UpdateFilamentRequest"/>.
/// </summary>
public UpdateFilamentRequestValidator()
{
RuleFor(x => x.MaterialBaseId)
.NotEmpty().WithMessage("MaterialBaseId is required.");
RuleFor(x => x.MaterialFinishId)
.NotEmpty().WithMessage("MaterialFinishId is required.");
RuleFor(x => x.Brand)
.NotEmpty().WithMessage("Brand is required.")
.MaximumLength(200).WithMessage("Brand must not exceed 200 characters.");
RuleFor(x => x.ColorName)
.NotEmpty().WithMessage("ColorName is required.")
.MaximumLength(200).WithMessage("ColorName must not exceed 200 characters.");
RuleFor(x => x.ColorHex)
.NotEmpty().WithMessage("ColorHex is required.")
.Matches(@"^#[0-9A-Fa-f]{6}$").WithMessage("ColorHex must be a valid hex color code (e.g., #FF0000).");
RuleFor(x => x.WeightTotalGrams)
.GreaterThan(0).WithMessage("Total weight must be greater than zero.");
RuleFor(x => x.WeightRemainingGrams)
.GreaterThanOrEqualTo(0).WithMessage("Remaining weight must be non-negative.");
RuleFor(x => x.WeightRemainingGrams)
.LessThanOrEqualTo(x => x.WeightTotalGrams)
.WithMessage("WeightRemainingGrams cannot exceed WeightTotalGrams.");
RuleFor(x => x.FilamentDiameterMm)
.GreaterThan(0).WithMessage("Filament diameter must be greater than zero.");
RuleFor(x => x.SpoolSerial)
.NotEmpty().WithMessage("SpoolSerial is required.")
.MaximumLength(200).WithMessage("SpoolSerial must not exceed 200 characters.");
When(x => x.PurchasePrice.HasValue, () =>
{
RuleFor(x => x.PurchasePrice!.Value)
.GreaterThanOrEqualTo(0).WithMessage("Purchase price must be non-negative.");
});
When(x => x.StorageLocation != null, () =>
{
RuleFor(x => x.StorageLocation!)
.MaximumLength(200).WithMessage("StorageLocation must not exceed 200 characters.");
});
}
}

backend/Dockerfile

@@ -0,0 +1,25 @@
# Build stage
FROM golang:1.24-alpine AS builder
WORKDIR /app
# Copy go mod files first for caching
COPY go.mod go.sum ./
RUN go mod download
# Copy source and build
COPY . .
RUN CGO_ENABLED=0 GOOS=linux go build -o server ./cmd/server
# Final stage
FROM alpine:latest
RUN apk --no-cache add ca-certificates
WORKDIR /root/
# Copy binary from builder
COPY --from=builder /app/server .
EXPOSE 8080
CMD ["./server"]


@@ -0,0 +1,19 @@
namespace Extrudex.Domain.DTOs.Moonraker;
/// <summary>
/// Response DTO for Moonraker /printer/objects/query?display_status endpoint.
/// Contains progress percentage and message for the current print job.
/// Used by the SignalR hub to push real-time progress to connected clients.
/// </summary>
public class MoonrakerDisplayStatus
{
/// <summary>
/// Print progress as a decimal between 0 and 1 (0% to 100%).
/// </summary>
public decimal Progress { get; set; }
/// <summary>
/// Status message displayed on the printer LCD (e.g., "Printing...", "Heating...").
/// </summary>
public string Message { get; set; } = string.Empty;
}


@@ -0,0 +1,20 @@
namespace Extrudex.Domain.DTOs.Moonraker;
/// <summary>
/// Response DTO for the Moonraker /server/history/items endpoint.
/// Wraps the paginated list of print job history items.
/// </summary>
public class MoonrakerHistoryResponse
{
/// <summary>
/// The list of print job history items returned by Moonraker.
/// Most recent jobs appear first (descending by start time).
/// </summary>
public List<MoonrakerPrintJob> Items { get; set; } = [];
/// <summary>
/// Total number of print jobs available on the server
/// (for pagination; the Items list may be a subset).
/// </summary>
public int TotalCount { get; set; }
}


@@ -0,0 +1,56 @@
namespace Extrudex.Domain.DTOs.Moonraker;
/// <summary>
/// Response DTO for a single Moonraker print job history item.
/// Maps to the objects returned by /server/history/items.
/// Contains filament usage, duration, and status for a completed or active print.
/// </summary>
public class MoonrakerPrintJob
{
/// <summary>
/// Unique Moonraker job identifier (e.g., "000001").
/// </summary>
public string JobId { get; set; } = string.Empty;
/// <summary>
/// Filename of the G-code file that was printed.
/// </summary>
public string Filename { get; set; } = string.Empty;
/// <summary>
/// Current status of this print job: "completed", "cancelled", "error", "in_progress".
/// </summary>
public string Status { get; set; } = string.Empty;
/// <summary>
/// Total filament used in millimeters for this print job.
/// This is the primary measurement; grams are derived from this value.
/// </summary>
public decimal FilamentUsedMm { get; set; }
/// <summary>
/// Total print duration in seconds.
/// </summary>
public decimal PrintDurationSeconds { get; set; }
/// <summary>
/// Total print duration including setup and warmup, in seconds.
/// </summary>
public decimal TotalDurationSeconds { get; set; }
/// <summary>
/// Timestamp when the print job started (UTC).
/// </summary>
public DateTime? StartTime { get; set; }
/// <summary>
/// Timestamp when the print job ended (UTC). Null if still in progress.
/// </summary>
public DateTime? EndTime { get; set; }
/// <summary>
/// Metadata dictionary from Moonraker. May contain filament_type,
/// filament_name, nozzle_diameter, and other slicer-provided fields.
/// </summary>
public Dictionary<string, object> Metadata { get; set; } = new();
}


@@ -0,0 +1,36 @@
namespace Extrudex.Domain.DTOs.Moonraker;
/// <summary>
/// Response DTO for Moonraker /printer/objects/query?print_stats endpoint.
/// Contains real-time print statistics including current job state,
/// filament consumed, and file being printed.
/// </summary>
public class MoonrakerPrintStats
{
/// <summary>
/// Current print state: "standby", "printing", "paused", "complete", "error", "cancelled".
/// </summary>
public string State { get; set; } = string.Empty;
/// <summary>
/// Total filament used in millimeters for the current print session.
/// </summary>
public decimal FilamentUsedMm { get; set; }
/// <summary>
/// Total print duration in seconds for the current print session.
/// </summary>
public decimal PrintDurationSeconds { get; set; }
/// <summary>
/// Filename of the G-code file currently being printed.
/// Null if no print is active.
/// </summary>
public string? Filename { get; set; }
/// <summary>
/// Detailed message from Klipper about the current print state.
/// May contain error details when state is "error".
/// </summary>
public string? Message { get; set; }
}


@@ -0,0 +1,26 @@
namespace Extrudex.Domain.DTOs.Moonraker;
/// <summary>
/// Response DTO for the Moonraker /printer/info endpoint.
/// Contains the current operational state of the Klipper printer.
/// Used to determine whether the printer is idle, printing, paused, or in error.
/// </summary>
public class MoonrakerPrinterInfo
{
/// <summary>
/// Current Klipper state: "ready", "startup", "shutdown", "error", "cancelled".
/// A "ready" state means the printer is connected and idle.
/// </summary>
public string State { get; set; } = string.Empty;
/// <summary>
/// Detailed state message from Klipper. May contain error details
/// when the state is "error" or "shutdown".
/// </summary>
public string StateMessage { get; set; } = string.Empty;
/// <summary>
/// Whether the Klipper firmware is currently connected and responsive.
/// </summary>
public bool KlippyReady { get; set; }
}


@@ -0,0 +1,25 @@
namespace Extrudex.Domain.DTOs.Moonraker;
/// <summary>
/// Request DTO for querying the Moonraker API.
/// Encapsulates the connection parameters needed to reach a specific
/// Moonraker instance on a Klipper-based printer.
/// </summary>
public class MoonrakerRequest
{
/// <summary>
/// Hostname or IP address of the Moonraker printer.
/// </summary>
public string HostnameOrIp { get; set; } = string.Empty;
/// <summary>
/// Port number for the Moonraker API. Default: 7125.
/// </summary>
public int Port { get; set; } = 7125;
/// <summary>
/// Optional API key for authenticating with Moonraker.
/// Required when the server has API key authentication enabled.
/// </summary>
public string? ApiKey { get; set; }
}


@@ -0,0 +1,44 @@
namespace Extrudex.Domain.DTOs.Moonraker;
/// <summary>
/// Response DTO for the Moonraker /server/info endpoint.
/// Contains server identification and operational state.
/// Used to verify connectivity and determine Moonraker version.
/// </summary>
public class MoonrakerServerInfo
{
/// <summary>
/// The hostname of the Moonraker server (e.g., "mainsail").
/// </summary>
public string Hostname { get; set; } = string.Empty;
/// <summary>
/// Moonraker software version string (e.g., "0.8.0-89ee464").
/// </summary>
public string SoftwareVersion { get; set; } = string.Empty;
/// <summary>
/// CPU model string reported by the host system.
/// </summary>
public string CpuInfo { get; set; } = string.Empty;
/// <summary>
/// Whether Klipper is currently connected to the MCU.
/// </summary>
public bool KlippyConnected { get; set; }
/// <summary>
/// The current Klipper state (e.g., "ready", "startup", "error").
/// </summary>
public string KlippyState { get; set; } = string.Empty;
/// <summary>
/// Whether the Moonraker API requires an authentication token.
/// </summary>
public bool ApiKeyRequired { get; set; }
/// <summary>
/// List of registered Moonraker plugin names.
/// </summary>
public List<string> Plugins { get; set; } = [];
}


@@ -0,0 +1,73 @@
using Extrudex.Domain.Base;
namespace Extrudex.Domain.Entities;
/// <summary>
/// Tracks filament consumption for a specific print job on a specific spool.
/// Each record captures the grams used, which printer consumed it, and when the
/// usage was recorded. This enables granular per-job usage analytics, COGS
/// reconciliation, and spool weight depletion tracking.
///
/// A single PrintJob may have multiple FilamentUsage records if multiple spools
/// were consumed (e.g., multi-material prints via AMS).
/// </summary>
public class FilamentUsage : AuditableEntity
{
/// <summary>
/// Foreign key to the print job that consumed this filament.
/// A usage record is always tied to a print job.
/// </summary>
public Guid PrintJobId { get; set; }
/// <summary>
/// Navigation to the print job that consumed this filament.
/// </summary>
public PrintJob PrintJob { get; set; } = null!;
/// <summary>
/// Foreign key to the spool (filament) that provided the material.
/// Links usage back to the specific physical spool for inventory tracking.
/// </summary>
public Guid SpoolId { get; set; }
/// <summary>
/// Navigation to the spool that provided the material.
/// </summary>
public Spool Spool { get; set; } = null!;
/// <summary>
/// Foreign key to the printer that executed the print job.
/// Denormalized from PrintJob for direct querying of per-printer usage.
/// </summary>
public Guid PrinterId { get; set; }
/// <summary>
/// Navigation to the printer that executed the print job.
/// </summary>
public Printer Printer { get; set; } = null!;
/// <summary>
/// Grams of filament consumed during this print job.
/// Derived from mm_extruded × cross_section_area × material_density,
/// or measured directly from AMS weight delta.
/// </summary>
public decimal GramsUsed { get; set; }
/// <summary>
/// Millimeters of filament extruded for this usage record.
/// The primary physical measurement; grams_used is derived from this.
/// </summary>
public decimal MmExtruded { get; set; }
/// <summary>
/// Timestamp when this usage record was created (UTC).
/// Represents when the usage was first logged, which may differ from
/// the print job's started_at or completed_at timestamps.
/// </summary>
public DateTime RecordedAt { get; set; } = DateTime.UtcNow;
/// <summary>
/// Optional notes about this usage record (e.g., "AMS tray 3", "manual weight check").
/// </summary>
public string? Notes { get; set; }
}


@@ -97,4 +97,10 @@ public class PrintJob : AuditableEntity
/// Optional notes about the print job (e.g., "First layer adhesion issues").
/// </summary>
public string? Notes { get; set; }
/// <summary>
/// Navigation collection of filament usage records for this print job.
/// Enables tracking granular per-spool consumption within a print.
/// </summary>
public ICollection<FilamentUsage> FilamentUsages { get; set; } = new List<FilamentUsage>();
}


@@ -94,4 +94,10 @@ public class Printer : AuditableEntity
/// Navigation collection of print jobs executed on this printer.
/// </summary>
public ICollection<PrintJob> PrintJobs { get; set; } = new List<PrintJob>();
/// <summary>
/// Navigation collection of filament usage records tracking consumption on this printer.
/// Enables querying per-printer filament usage and COGS.
/// </summary>
public ICollection<FilamentUsage> FilamentUsages { get; set; } = new List<FilamentUsage>();
}


@@ -93,6 +93,20 @@ public class Spool : AuditableEntity
/// </summary>
public bool IsActive { get; set; } = true;
/// <summary>
/// Whether the spool has been archived (removed from active inventory).
/// Archived spools are retained for historical records but hidden from
/// default inventory views. Distinguishes long-term archival from
/// temporary inactivity (e.g., spool swapped out of AMS).
/// </summary>
public bool IsArchived { get; set; } = false;
/// <summary>
/// Physical storage location of the spool (e.g., "Shelf A", "Drawer 3", "AMS Tray 2").
/// Optional — not every spool has a designated storage location.
/// </summary>
public string? StorageLocation { get; set; }
/// <summary>
/// Navigation collection of AMS slots where this spool is loaded.
/// </summary>
@@ -102,4 +116,10 @@ public class Spool : AuditableEntity
/// Navigation collection of print jobs that consumed filament from this spool.
/// </summary>
public ICollection<PrintJob> PrintJobs { get; set; } = new List<PrintJob>();
/// <summary>
/// Navigation collection of filament usage records tracking consumption from this spool.
/// Enables querying how much filament was consumed per print job.
/// </summary>
public ICollection<FilamentUsage> FilamentUsages { get; set; } = new List<FilamentUsage>();
}


@@ -0,0 +1,72 @@
using Extrudex.Domain.Base;
using Extrudex.Domain.Enums;
namespace Extrudex.Domain.Entities;
/// <summary>
/// Represents a single filament usage log entry. Records how much filament
/// was consumed, by which printer, at what time, and optionally linked to
/// a print job. This provides a fine-grained audit trail of filament consumption
/// independent of print job lifecycle.
/// </summary>
public class UsageLog : AuditableEntity
{
/// <summary>
/// Foreign key to the spool that provided the filament.
/// </summary>
public Guid SpoolId { get; set; }
/// <summary>
/// Navigation to the spool that provided the filament.
/// </summary>
public Spool Spool { get; set; } = null!;
/// <summary>
/// Foreign key to the printer that consumed the filament.
/// Nullable to support manual entries without a specific printer.
/// </summary>
public Guid? PrinterId { get; set; }
/// <summary>
/// Navigation to the printer that consumed the filament.
/// </summary>
public Printer? Printer { get; set; }
/// <summary>
/// Foreign key to the print job associated with this usage entry.
/// Nullable because usage can be logged before or without a print job.
/// </summary>
public Guid? PrintJobId { get; set; }
/// <summary>
/// Navigation to the print job associated with this usage entry.
/// </summary>
public PrintJob? PrintJob { get; set; }
/// <summary>
/// The number of grams of filament consumed in this usage event.
/// </summary>
public decimal GramsUsed { get; set; }
/// <summary>
/// The number of millimeters of filament extruded in this usage event.
/// Optional — may not be available for all data sources.
/// </summary>
public decimal? MmExtruded { get; set; }
/// <summary>
/// Timestamp when the usage occurred (UTC). This is the actual time of
/// consumption, which may differ from CreatedAt if the entry was recorded later.
/// </summary>
public DateTime UsageTimestamp { get; set; } = DateTime.UtcNow;
/// <summary>
/// The source of the usage data (which integration path provided it).
/// </summary>
public DataSource DataSource { get; set; } = DataSource.Manual;
/// <summary>
/// Optional notes about this usage entry.
/// </summary>
public string? Notes { get; set; }
}

View File

@@ -0,0 +1,76 @@
namespace Extrudex.Domain.Interfaces;
/// <summary>
/// Service interface for calculating the cost of goods sold (COGS) per print job.
/// Uses the spool's purchase price and the print job's derived grams consumed
/// to produce a cost breakdown. Handles missing cost data gracefully by returning
/// warnings rather than throwing exceptions.
/// </summary>
public interface ICostPerPrintService
{
/// <summary>
/// Calculates the cost per print for a specific print job.
/// </summary>
/// <param name="printJobId">The unique identifier of the print job.</param>
/// <param name="cancellationToken">Optional cancellation token.</param>
/// <returns>
/// A <see cref="CostPerPrintResult"/> containing the cost breakdown,
/// or warnings if cost data is missing or incomplete.
/// </returns>
Task<CostPerPrintResult> CalculateAsync(Guid printJobId, CancellationToken cancellationToken = default);
/// <summary>
/// Calculates cost breakdowns for all print jobs associated with a specific spool.
/// Useful for spool-level COGS reporting.
/// </summary>
/// <param name="spoolId">The unique identifier of the spool.</param>
/// <param name="cancellationToken">Optional cancellation token.</param>
/// <returns>
/// A list of <see cref="CostPerPrintResult"/> for each print job on the spool.
/// Jobs with missing cost data will include warnings.
/// </returns>
Task<IReadOnlyList<CostPerPrintResult>> CalculateBySpoolAsync(Guid spoolId, CancellationToken cancellationToken = default);
}
/// <summary>
/// Result of a cost-per-print calculation. Contains the cost breakdown
/// and any warnings about missing or incomplete cost data.
/// </summary>
public class CostPerPrintResult
{
/// <summary>The print job identifier this result belongs to.</summary>
public Guid PrintJobId { get; set; }
/// <summary>Human-readable name of the print job.</summary>
public string PrintName { get; set; } = string.Empty;
/// <summary>The spool identifier that provided filament.</summary>
public Guid SpoolId { get; set; }
/// <summary>Serial number of the spool.</summary>
public string SpoolSerial { get; set; } = string.Empty;
/// <summary>Total millimeters of filament extruded.</summary>
public decimal MmExtruded { get; set; }
/// <summary>Derived grams consumed for this print.</summary>
public decimal GramsDerived { get; set; }
/// <summary>The spool's purchase price. Null if not recorded.</summary>
public decimal? PurchasePrice { get; set; }
/// <summary>The spool's total weight in grams when full.</summary>
public decimal? WeightTotalGrams { get; set; }
/// <summary>Cost per gram of filament. Null if purchase price or total weight is missing.</summary>
public decimal? CostPerGram { get; set; }
/// <summary>Calculated cost of this print job. Null if cost data is incomplete.</summary>
public decimal? CostPerPrint { get; set; }
/// <summary>
/// Warnings about missing or incomplete data that prevented a full calculation.
/// Empty when all data is available and the calculation succeeded.
/// </summary>
public List<string> Warnings { get; set; } = new();
}
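The arithmetic this contract implies is straightforward: cost per gram is the spool's purchase price divided by its full weight, and cost per print is derived grams times cost per gram, with nulls producing warnings rather than exceptions. A sketch of that calculation against the `CostPerPrintResult` type above (illustrative helper, not the actual implementation):

```csharp
public static class CostMath
{
    // Fills in CostPerGram and CostPerPrint when the inputs allow it;
    // otherwise records warnings, matching the interface's "no throw" contract.
    public static CostPerPrintResult Compute(CostPerPrintResult result)
    {
        if (result.PurchasePrice is null)
            result.Warnings.Add("Spool has no purchase price recorded.");
        if (result.WeightTotalGrams is null || result.WeightTotalGrams == 0)
            result.Warnings.Add("Spool has no total weight recorded.");

        if (result.Warnings.Count == 0)
        {
            result.CostPerGram = result.PurchasePrice / result.WeightTotalGrams;
            result.CostPerPrint = result.GramsDerived * result.CostPerGram;
        }
        return result;
    }
}
```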

View File

@@ -0,0 +1,50 @@
using Extrudex.Domain.Entities;
namespace Extrudex.Domain.Interfaces;
/// <summary>
/// Service for persisting and querying filament usage records.
/// Tracks consumption per print job and per spool for COGS and inventory tracking.
/// </summary>
public interface IFilamentUsageService
{
/// <summary>
/// Records a new filament usage entry for a print job.
/// </summary>
/// <param name="printJobId">The print job that consumed the filament.</param>
/// <param name="spoolId">The spool that provided the filament.</param>
/// <param name="printerId">The printer that executed the print.</param>
/// <param name="gramsUsed">Grams of filament consumed.</param>
/// <param name="mmExtruded">Millimeters of filament extruded.</param>
/// <param name="notes">Optional notes about this usage record.</param>
/// <param name="cancellationToken">Cancellation token.</param>
/// <returns>The created FilamentUsage entity.</returns>
Task<FilamentUsage> RecordUsageAsync(
Guid printJobId,
Guid spoolId,
Guid printerId,
decimal gramsUsed,
decimal mmExtruded,
string? notes = null,
CancellationToken cancellationToken = default);
/// <summary>
/// Retrieves all filament usage records for a specific print job.
/// </summary>
/// <param name="printJobId">The print job ID.</param>
/// <param name="cancellationToken">Cancellation token.</param>
/// <returns>Collection of filament usage records for the print job.</returns>
Task<IReadOnlyList<FilamentUsage>> GetByPrintJobAsync(
Guid printJobId,
CancellationToken cancellationToken = default);
/// <summary>
/// Retrieves all filament usage records for a specific spool.
/// </summary>
/// <param name="spoolId">The spool ID.</param>
/// <param name="cancellationToken">Cancellation token.</param>
/// <returns>Collection of filament usage records for the spool.</returns>
Task<IReadOnlyList<FilamentUsage>> GetBySpoolAsync(
Guid spoolId,
CancellationToken cancellationToken = default);
}

View File

@@ -0,0 +1,19 @@
namespace Extrudex.Domain.Interfaces;
/// <summary>
/// Service interface for syncing filament usage data from printers
/// into the Extrudex database. Handles querying Moonraker printers,
/// computing derived usage metrics, and persisting updates to spools
/// and print job records.
/// </summary>
public interface IFilamentUsageSyncService
{
/// <summary>
/// Performs a single sync cycle: queries all active Moonraker printers,
/// fetches their current filament usage data, and persists updates to
/// the database.
/// </summary>
/// <param name="cancellationToken">Cancellation token for graceful shutdown.</param>
/// <returns>The number of printers successfully synced.</returns>
Task<int> SyncAllAsync(CancellationToken cancellationToken = default);
}

View File

@@ -0,0 +1,39 @@
namespace Extrudex.Domain.Interfaces;
/// <summary>
/// Detects low-stock filament spools based on configurable weight thresholds.
/// Determines whether a spool's remaining filament falls below a critical level
/// so that alerts and API flags can be surfaced to the user.
/// </summary>
public interface ILowStockDetector
{
/// <summary>
/// Determines whether a spool is considered low stock based on its remaining
/// weight relative to its total weight and the configured threshold percentage.
/// </summary>
/// <param name="weightRemainingGrams">The current remaining weight in grams.</param>
/// <param name="weightTotalGrams">The total spool weight in grams when full.</param>
/// <returns>
/// <c>true</c> if the remaining weight percentage is at or below the configured
/// low-stock threshold; <c>false</c> otherwise. Returns <c>false</c> for spools
/// with zero total weight to avoid division-by-zero.
/// </returns>
bool IsLowStock(decimal weightRemainingGrams, decimal weightTotalGrams);
/// <summary>
/// Calculates the remaining weight as a percentage of total weight.
/// </summary>
/// <param name="weightRemainingGrams">The current remaining weight in grams.</param>
/// <param name="weightTotalGrams">The total spool weight in grams when full.</param>
/// <returns>
/// A value between 0 and 100 representing the percentage of filament remaining.
/// Returns 0 if total weight is zero to avoid division-by-zero.
/// </returns>
decimal GetRemainingWeightPercent(decimal weightRemainingGrams, decimal weightTotalGrams);
/// <summary>
/// Gets the currently configured low-stock threshold percentage.
/// Useful for API responses so clients know what threshold is in effect.
/// </summary>
decimal LowStockThresholdPercent { get; }
}
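Because the contract fully pins down the division-by-zero behavior, an implementation is nearly mechanical. A minimal sketch, assuming the threshold is supplied from configuration (the 20% default here is illustrative, not taken from this changeset):

```csharp
public class LowStockDetector : ILowStockDetector
{
    public LowStockDetector(decimal lowStockThresholdPercent = 20m) =>
        LowStockThresholdPercent = lowStockThresholdPercent;

    public decimal LowStockThresholdPercent { get; }

    public decimal GetRemainingWeightPercent(decimal weightRemainingGrams, decimal weightTotalGrams) =>
        weightTotalGrams == 0
            ? 0  // avoid division-by-zero, per the contract
            : weightRemainingGrams / weightTotalGrams * 100m;

    public bool IsLowStock(decimal weightRemainingGrams, decimal weightTotalGrams) =>
        weightTotalGrams != 0  // zero-total spools are never flagged
        && GetRemainingWeightPercent(weightRemainingGrams, weightTotalGrams) <= LowStockThresholdPercent;
}
```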

View File

@@ -0,0 +1,131 @@
using Extrudex.Domain.DTOs.Moonraker;
namespace Extrudex.Domain.Interfaces;
/// <summary>
/// Client interface for communicating with Moonraker REST API endpoints
/// on Klipper-based printers (e.g., Elegoo Centauri Carbon).
/// Provides strongly-typed methods for server discovery, printer status,
/// print job history, and real-time telemetry.
/// </summary>
public interface IMoonrakerClient
{
/// <summary>
/// Checks whether the Moonraker server is reachable and responding.
/// Calls the /server/info endpoint and returns the server information
/// if successful, or null if the server is unreachable.
/// </summary>
/// <param name="hostnameOrIp">The printer's hostname or IP address.</param>
/// <param name="port">The Moonraker API port (default: 7125).</param>
/// <param name="apiKey">Optional API key for authentication.</param>
/// <param name="cancellationToken">Cancellation token for the HTTP request.</param>
/// <returns>Server info if reachable; <c>null</c> if unreachable.</returns>
Task<MoonrakerServerInfo?> GetServerInfoAsync(
string hostnameOrIp,
int port,
string? apiKey,
CancellationToken cancellationToken = default);
/// <summary>
/// Checks whether the Moonraker server is reachable and responding.
/// This is a convenience method equivalent to calling GetServerInfoAsync
/// and checking for a non-null result.
/// </summary>
/// <param name="hostnameOrIp">The printer's hostname or IP address.</param>
/// <param name="port">The Moonraker API port (default: 7125).</param>
/// <param name="apiKey">Optional API key for authentication.</param>
/// <param name="cancellationToken">Cancellation token for the HTTP request.</param>
/// <returns><c>true</c> if the server responded successfully; otherwise <c>false</c>.</returns>
Task<bool> IsReachableAsync(
string hostnameOrIp,
int port,
string? apiKey,
CancellationToken cancellationToken = default);
/// <summary>
/// Fetches the current printer info from the /printer/info endpoint.
/// Returns the Klipper state and readiness status.
/// </summary>
/// <param name="hostnameOrIp">The printer's hostname or IP address.</param>
/// <param name="port">The Moonraker API port (default: 7125).</param>
/// <param name="apiKey">Optional API key for authentication.</param>
/// <param name="cancellationToken">Cancellation token for the HTTP request.</param>
/// <returns>Printer info if successful; <c>null</c> if the request failed.</returns>
Task<MoonrakerPrinterInfo?> GetPrinterInfoAsync(
string hostnameOrIp,
int port,
string? apiKey,
CancellationToken cancellationToken = default);
/// <summary>
/// Fetches print job history from the /server/history/items endpoint.
/// Returns the most recent print jobs with filament usage data,
/// print duration, and completion status.
/// </summary>
/// <param name="hostnameOrIp">The printer's hostname or IP address.</param>
/// <param name="port">The Moonraker API port (default: 7125).</param>
/// <param name="apiKey">Optional API key for authentication.</param>
/// <param name="limit">Maximum number of history items to return. Default: 50.</param>
/// <param name="cancellationToken">Cancellation token for the HTTP request.</param>
/// <returns>History response with print jobs; empty list if request failed.</returns>
Task<MoonrakerHistoryResponse> GetPrintHistoryAsync(
string hostnameOrIp,
int port,
string? apiKey,
int limit = 50,
CancellationToken cancellationToken = default);
/// <summary>
/// Fetches the current print statistics from the /printer/objects/query endpoint.
/// Returns real-time data including filament used, print duration,
/// and current print state for the active or most recent print.
/// </summary>
/// <param name="hostnameOrIp">The printer's hostname or IP address.</param>
/// <param name="port">The Moonraker API port (default: 7125).</param>
/// <param name="apiKey">Optional API key for authentication.</param>
/// <param name="cancellationToken">Cancellation token for the HTTP request.</param>
/// <returns>Print stats if successful; <c>null</c> if the request failed.</returns>
Task<MoonrakerPrintStats?> GetPrintStatsAsync(
string hostnameOrIp,
int port,
string? apiKey,
CancellationToken cancellationToken = default);
/// <summary>
/// Fetches the current display status from the /printer/objects/query endpoint.
/// Returns progress percentage and status message for the active print.
/// Used by SignalR to push real-time progress updates to connected clients.
/// </summary>
/// <param name="hostnameOrIp">The printer's hostname or IP address.</param>
/// <param name="port">The Moonraker API port (default: 7125).</param>
/// <param name="apiKey">Optional API key for authentication.</param>
/// <param name="cancellationToken">Cancellation token for the HTTP request.</param>
/// <returns>Display status if successful; <c>null</c> if the request failed.</returns>
Task<MoonrakerDisplayStatus?> GetDisplayStatusAsync(
string hostnameOrIp,
int port,
string? apiKey,
CancellationToken cancellationToken = default);
/// <summary>
/// Fetches the current filament usage data from the Moonraker server.
/// Returns a dictionary of usage metrics reported by the printer.
///
/// <para>
/// <b>Prefer GetPrintHistoryAsync or GetPrintStatsAsync for new code.</b>
/// This method is retained for backward compatibility with the
/// FilamentUsageSyncService and returns a dictionary of metric names
/// to their decimal values for callers that don't need typed DTOs.
/// </para>
/// </summary>
/// <param name="hostnameOrIp">The printer's hostname or IP address.</param>
/// <param name="port">The Moonraker API port (default: 7125).</param>
/// <param name="apiKey">Optional API key for authentication.</param>
/// <param name="cancellationToken">Cancellation token for the HTTP request.</param>
/// <returns>A dictionary of usage metric names to their decimal values.</returns>
Task<Dictionary<string, decimal>> GetFilamentUsageAsync(
string hostnameOrIp,
int port,
string? apiKey,
CancellationToken cancellationToken = default);
}
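To make the reachability contract concrete, here is one way `IsReachableAsync` could be built over Moonraker's `/server/info` endpoint. The `X-Api-Key` header is how Moonraker commonly accepts API keys; treat the details of this sketch as assumptions, not the actual client code in this PR:

```csharp
using System.Net.Http;

public class MoonrakerReachabilityCheck
{
    private readonly HttpClient _http;
    public MoonrakerReachabilityCheck(HttpClient http) => _http = http;

    public async Task<bool> IsReachableAsync(
        string hostnameOrIp, int port = 7125, string? apiKey = null,
        CancellationToken cancellationToken = default)
    {
        using var request = new HttpRequestMessage(
            HttpMethod.Get, $"http://{hostnameOrIp}:{port}/server/info");
        if (!string.IsNullOrEmpty(apiKey))
            request.Headers.Add("X-Api-Key", apiKey);
        try
        {
            using var response = await _http.SendAsync(request, cancellationToken);
            return response.IsSuccessStatusCode;
        }
        catch (HttpRequestException)
        {
            return false;  // unreachable maps to false, per the interface contract
        }
    }
}
```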

View File

@@ -0,0 +1,20 @@
using Extrudex.Domain.DTOs.Moonraker;
namespace Extrudex.Domain.Interfaces;
/// <summary>
/// Service interface for syncing Moonraker printer data into the Extrudex database.
/// Handles periodic polling of printer status and mapping print job history
/// to PrintJob and FilamentUsage entities.
/// </summary>
public interface IMoonrakerPrinterSyncService
{
/// <summary>
/// Performs a single sync cycle: queries all active Moonraker printers,
/// fetches their current status and print job history, and persists
/// updates to the database.
/// </summary>
/// <param name="cancellationToken">Cancellation token for graceful shutdown.</param>
/// <returns>The number of printers successfully synced.</returns>
Task<int> SyncAllAsync(CancellationToken cancellationToken = default);
}

View File

@@ -0,0 +1,57 @@
using Extrudex.Domain.Entities;
using Extrudex.Domain.Enums;
namespace Extrudex.Domain.Interfaces;
/// <summary>
/// Service for recording filament usage entries. Writes to the usage_logs table
/// and provides query capabilities for usage history.
/// </summary>
public interface IUsageLogService
{
/// <summary>
/// Records a filament usage entry.
/// </summary>
/// <param name="spoolId">The spool that provided the filament.</param>
/// <param name="gramsUsed">Grams of filament consumed.</param>
/// <param name="dataSource">Where the data came from.</param>
/// <param name="printerId">Optional printer ID.</param>
/// <param name="printJobId">Optional print job ID.</param>
/// <param name="mmExtruded">Optional mm extruded.</param>
/// <param name="usageTimestamp">When the usage occurred (defaults to UTC now).</param>
/// <param name="notes">Optional notes.</param>
/// <returns>The created UsageLog entity.</returns>
Task<UsageLog> RecordUsageAsync(
Guid spoolId,
decimal gramsUsed,
DataSource dataSource,
Guid? printerId = null,
Guid? printJobId = null,
decimal? mmExtruded = null,
DateTime? usageTimestamp = null,
string? notes = null);
/// <summary>
/// Retrieves usage logs for a specific spool, ordered by usage timestamp descending.
/// </summary>
/// <param name="spoolId">The spool ID to filter by.</param>
/// <param name="cancellationToken">Cancellation token.</param>
/// <returns>A collection of usage logs for the spool.</returns>
Task<IEnumerable<UsageLog>> GetBySpoolAsync(Guid spoolId, CancellationToken cancellationToken = default);
/// <summary>
/// Retrieves usage logs for a specific printer, ordered by usage timestamp descending.
/// </summary>
/// <param name="printerId">The printer ID to filter by.</param>
/// <param name="cancellationToken">Cancellation token.</param>
/// <returns>A collection of usage logs for the printer.</returns>
Task<IEnumerable<UsageLog>> GetByPrinterAsync(Guid printerId, CancellationToken cancellationToken = default);
/// <summary>
/// Retrieves usage logs for a specific print job, ordered by usage timestamp descending.
/// </summary>
/// <param name="printJobId">The print job ID to filter by.</param>
/// <param name="cancellationToken">Cancellation token.</param>
/// <returns>A collection of usage logs for the print job.</returns>
Task<IEnumerable<UsageLog>> GetByPrintJobAsync(Guid printJobId, CancellationToken cancellationToken = default);
}
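With most parameters optional, a manual entry needs only a spool, a weight, and a data source. An illustrative call (the `usageLogService` variable, `spoolId`, and note text are placeholders):

```csharp
// Record a manually logged usage event against a spool; printer, print job,
// mm extruded, and timestamp all default to null / UTC now.
var entry = await usageLogService.RecordUsageAsync(
    spoolId: spoolId,
    gramsUsed: 12.4m,
    dataSource: DataSource.Manual,
    notes: "Purge line + calibration cube");
```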

View File

@@ -9,6 +9,7 @@
</PropertyGroup>
<ItemGroup>
<PackageReference Include="AspNetCore.HealthChecks.NpgSql" Version="9.0.0" />
<PackageReference Include="FluentValidation.DependencyInjectionExtensions" Version="12.1.1" />
<PackageReference Include="Microsoft.EntityFrameworkCore" Version="9.0.3" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="9.0.3" />

View File

@@ -0,0 +1,33 @@
namespace Extrudex.Infrastructure.Configuration;
/// <summary>
/// Configuration options for the FilamentUsageSync background job.
/// Bound from appsettings.json under the "FilamentUsageSync" section.
/// Controls polling interval and per-printer timeout settings.
/// </summary>
public class FilamentUsageSyncOptions
{
/// <summary>
/// The section name in appsettings.json where these options are bound.
/// </summary>
public const string SectionName = "FilamentUsageSync";
/// <summary>
/// How often the background job polls printers for usage data.
/// Default: 5 minutes. Minimum recommended: 1 minute.
/// </summary>
public TimeSpan PollingInterval { get; set; } = TimeSpan.FromMinutes(5);
/// <summary>
/// Timeout for individual HTTP requests to a Moonraker printer.
/// Default: 30 seconds.
/// </summary>
public TimeSpan RequestTimeout { get; set; } = TimeSpan.FromSeconds(30);
/// <summary>
/// Whether the sync job is enabled. Set to false to disable
/// the background job without removing its registration.
/// Default: true.
/// </summary>
public bool Enabled { get; set; } = true;
}
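For reference, the corresponding `appsettings.json` section would look like the fragment below. This is an assumed example: the `TimeSpan` properties bind from `"hh:mm:ss"` strings via the default options binder, and the values shown simply restate the defaults.

```json
{
  "FilamentUsageSync": {
    "PollingInterval": "00:05:00",
    "RequestTimeout": "00:00:30",
    "Enabled": true
  }
}
```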

View File

@@ -0,0 +1,41 @@
namespace Extrudex.Infrastructure.Configuration;
/// <summary>
/// Configuration options for the MoonrakerPrinterSync background service.
/// Bound from appsettings.json under the "MoonrakerPrinterSync" section.
/// Controls polling interval, timeouts, and feature toggles for the
/// printer status and print job mapping service.
/// </summary>
public class MoonrakerPrinterSyncOptions
{
/// <summary>
/// The section name in appsettings.json where these options are bound.
/// </summary>
public const string SectionName = "MoonrakerPrinterSync";
/// <summary>
/// How often the background service polls Moonraker printers for status
/// and print job data. Default: 1 minute.
/// </summary>
public TimeSpan PollingInterval { get; set; } = TimeSpan.FromMinutes(1);
/// <summary>
/// Timeout for individual HTTP requests to a Moonraker printer.
/// Default: 15 seconds.
/// </summary>
public TimeSpan RequestTimeout { get; set; } = TimeSpan.FromSeconds(15);
/// <summary>
/// Whether the Moonraker printer sync service is enabled.
/// Set to false to disable without removing the service registration.
/// Default: true.
/// </summary>
public bool Enabled { get; set; } = true;
/// <summary>
/// Maximum number of print history items to fetch per printer per sync cycle.
/// Controls the batch size when syncing print jobs from Moonraker.
/// Default: 25.
/// </summary>
public int HistoryBatchSize { get; set; } = 25;
}

View File

@@ -49,15 +49,26 @@ public abstract class BaseEntityConfiguration<TEntity> : IEntityTypeConfiguratio
}
/// <summary>
/// Converts PascalCase or camelCase entity name to plural snake_case table name.
/// e.g. MaterialBase → material_bases, AmsSlot → ams_slots
/// </summary>
protected static string ToSnakeCase(string name)
{
var snake = string.Concat(
name.Select((ch, i) =>
i > 0 && char.IsUpper(ch) && (char.IsLower(name[i - 1]) || (i + 1 < name.Length && char.IsLower(name[i + 1])))
? "_" + ch
: ch.ToString()))
.ToLowerInvariant();
// Pluralize: add 's' for the common case; irregular plurals can be handled explicitly if needed.
// Special cases: a name already ending in 's' is left as-is, consonant + 'y' → 'ies', 'x'/'ch'/'sh' → 'es'.
if (snake.EndsWith("s"))
return snake; // Already plural or ambiguous — leave as-is
if (snake.EndsWith("y") && !snake.EndsWith("ay") && !snake.EndsWith("ey") && !snake.EndsWith("oy") && !snake.EndsWith("uy"))
return snake[..^1] + "ies";
if (snake.EndsWith("x") || snake.EndsWith("ch") || snake.EndsWith("sh"))
return snake + "es";
return snake + "s";
}
}

View File

@@ -0,0 +1,83 @@
using Extrudex.Domain.Entities;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Metadata.Builders;
namespace Extrudex.Infrastructure.Data.Configurations;
public class FilamentUsageConfiguration : BaseEntityConfiguration<FilamentUsage>
{
public override void Configure(EntityTypeBuilder<FilamentUsage> builder)
{
base.Configure(builder);
builder.Property(e => e.PrintJobId)
.HasColumnName("print_job_id")
.IsRequired();
builder.Property(e => e.SpoolId)
.HasColumnName("spool_id")
.IsRequired();
builder.Property(e => e.PrinterId)
.HasColumnName("printer_id")
.IsRequired();
builder.Property(e => e.GramsUsed)
.HasColumnName("grams_used")
.HasPrecision(10, 2)
.IsRequired();
builder.Property(e => e.MmExtruded)
.HasColumnName("mm_extruded")
.HasPrecision(12, 2)
.IsRequired();
builder.Property(e => e.RecordedAt)
.HasColumnName("recorded_at")
.HasDefaultValueSql("now() at time zone 'utc'")
.IsRequired();
builder.Property(e => e.Notes)
.HasColumnName("notes")
.HasMaxLength(2000);
// Index on print_job_id for querying usage by print job
builder.HasIndex(e => e.PrintJobId)
.HasDatabaseName("ix_filament_usages_print_job_id");
// Index on spool_id for querying usage by spool (filament)
builder.HasIndex(e => e.SpoolId)
.HasDatabaseName("ix_filament_usages_spool_id");
// Index on printer_id for querying usage by printer
builder.HasIndex(e => e.PrinterId)
.HasDatabaseName("ix_filament_usages_printer_id");
// Index on recorded_at for time-range queries
builder.HasIndex(e => e.RecordedAt)
.HasDatabaseName("ix_filament_usages_recorded_at");
// Composite index for querying usage by spool within a date range
builder.HasIndex(e => new { e.SpoolId, e.RecordedAt })
.HasDatabaseName("ix_filament_usages_spool_id_recorded_at");
// Relationships
builder.HasOne(e => e.PrintJob)
.WithMany(e => e.FilamentUsages)
.HasForeignKey(e => e.PrintJobId)
.HasConstraintName("fk_filament_usages_print_job")
.OnDelete(DeleteBehavior.Cascade);
builder.HasOne(e => e.Spool)
.WithMany(e => e.FilamentUsages)
.HasForeignKey(e => e.SpoolId)
.HasConstraintName("fk_filament_usages_spool")
.OnDelete(DeleteBehavior.Restrict);
builder.HasOne(e => e.Printer)
.WithMany(e => e.FilamentUsages)
.HasForeignKey(e => e.PrinterId)
.HasConstraintName("fk_filament_usages_printer")
.OnDelete(DeleteBehavior.Restrict);
}
}

View File

@@ -68,6 +68,15 @@ public class SpoolConfiguration : BaseEntityConfiguration<Spool>
.HasDefaultValue(true)
.IsRequired();
builder.Property(e => e.IsArchived)
.HasColumnName("is_archived")
.HasDefaultValue(false)
.IsRequired();
builder.Property(e => e.StorageLocation)
.HasColumnName("storage_location")
.HasMaxLength(200);
// Unique index on spool_serial — critical for barcode/QR scanning
builder.HasIndex(e => e.SpoolSerial)
.IsUnique()
@@ -77,10 +86,26 @@ public class SpoolConfiguration : BaseEntityConfiguration<Spool>
builder.HasIndex(e => e.MaterialBaseId)
.HasDatabaseName("ix_spools_material_base_id");
// Index on material_finish_id for spool filtering
builder.HasIndex(e => e.MaterialFinishId)
.HasDatabaseName("ix_spools_material_finish_id");
// Index on material_modifier_id for spool filtering
builder.HasIndex(e => e.MaterialModifierId)
.HasDatabaseName("ix_spools_material_modifier_id");
// Index on is_active for active spool queries
builder.HasIndex(e => e.IsActive)
.HasDatabaseName("ix_spools_is_active");
// Index on is_archived for inventory filtering (exclude archived from default views)
builder.HasIndex(e => e.IsArchived)
.HasDatabaseName("ix_spools_is_archived");
// Composite index on is_active + is_archived for common inventory queries
builder.HasIndex(e => new { e.IsActive, e.IsArchived })
.HasDatabaseName("ix_spools_active_archived");
// Relationships
builder.HasOne(e => e.MaterialBase)
.WithMany(e => e.Spools)

View File

@@ -0,0 +1,91 @@
using Extrudex.Domain.Entities;
using Extrudex.Domain.Enums;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Metadata.Builders;
namespace Extrudex.Infrastructure.Data.Configurations;
/// <summary>
/// EF Core configuration for the UsageLog entity.
/// Maps to the usage_logs table with snake_case columns and appropriate indexes.
/// </summary>
public class UsageLogConfiguration : BaseEntityConfiguration<UsageLog>
{
/// <inheritdoc/>
public override void Configure(EntityTypeBuilder<UsageLog> builder)
{
base.Configure(builder);
builder.Property(e => e.SpoolId)
.HasColumnName("spool_id")
.IsRequired();
builder.Property(e => e.PrinterId)
.HasColumnName("printer_id");
builder.Property(e => e.PrintJobId)
.HasColumnName("print_job_id");
builder.Property(e => e.GramsUsed)
.HasColumnName("grams_used")
.HasPrecision(10, 2)
.IsRequired();
builder.Property(e => e.MmExtruded)
.HasColumnName("mm_extruded")
.HasPrecision(12, 2);
builder.Property(e => e.UsageTimestamp)
.HasColumnName("usage_timestamp")
.IsRequired();
builder.Property(e => e.DataSource)
.HasColumnName("data_source")
.HasConversion<string>()
.HasMaxLength(50)
.IsRequired();
builder.Property(e => e.Notes)
.HasColumnName("notes")
.HasMaxLength(2000);
// Index on spool_id for querying usage by spool
builder.HasIndex(e => e.SpoolId)
.HasDatabaseName("ix_usage_logs_spool_id");
// Index on printer_id for querying usage by printer
builder.HasIndex(e => e.PrinterId)
.HasDatabaseName("ix_usage_logs_printer_id");
// Index on print_job_id for querying usage by print job
builder.HasIndex(e => e.PrintJobId)
.HasDatabaseName("ix_usage_logs_print_job_id");
// Index on usage_timestamp for chronological queries
builder.HasIndex(e => e.UsageTimestamp)
.HasDatabaseName("ix_usage_logs_usage_timestamp");
// Index on data_source for filtering by integration path
builder.HasIndex(e => e.DataSource)
.HasDatabaseName("ix_usage_logs_data_source");
// Relationships
builder.HasOne(e => e.Spool)
.WithMany()
.HasForeignKey(e => e.SpoolId)
.HasConstraintName("fk_usage_logs_spool")
.OnDelete(DeleteBehavior.Restrict);
builder.HasOne(e => e.Printer)
.WithMany()
.HasForeignKey(e => e.PrinterId)
.HasConstraintName("fk_usage_logs_printer")
.OnDelete(DeleteBehavior.SetNull);
builder.HasOne(e => e.PrintJob)
.WithMany()
.HasForeignKey(e => e.PrintJobId)
.HasConstraintName("fk_usage_logs_print_job")
.OnDelete(DeleteBehavior.SetNull);
}
}

View File

@@ -23,6 +23,8 @@ public class ExtrudexDbContext : DbContext
public DbSet<AmsUnit> AmsUnits => Set<AmsUnit>();
public DbSet<AmsSlot> AmsSlots => Set<AmsSlot>();
public DbSet<PrintJob> PrintJobs => Set<PrintJob>();
public DbSet<FilamentUsage> FilamentUsages => Set<FilamentUsage>();
public DbSet<UsageLog> UsageLogs => Set<UsageLog>();
protected override void OnModelCreating(ModelBuilder modelBuilder)
{

View File

@@ -0,0 +1,958 @@
// <auto-generated />
using System;
using Extrudex.Infrastructure.Data;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Migrations;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace Extrudex.Infrastructure.Data.Migrations
{
[DbContext(typeof(ExtrudexDbContext))]
[Migration("20260426131419_InitialCreate")]
partial class InitialCreate
{
/// <inheritdoc />
protected override void BuildTargetModel(ModelBuilder modelBuilder)
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("ProductVersion", "9.0.3")
.HasAnnotation("Relational:MaxIdentifierLength", 63);
NpgsqlModelBuilderExtensions.UseIdentityByDefaultColumns(modelBuilder);
modelBuilder.Entity("Extrudex.Domain.Entities.AmsSlot", b =>
{
b.Property<Guid>("Id")
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<Guid>("AmsUnitId")
.HasColumnType("uuid")
.HasColumnName("ams_unit_id");
b.Property<DateTime>("CreatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.Property<decimal?>("RemainingWeightG")
.HasPrecision(10, 2)
.HasColumnType("numeric(10,2)")
.HasColumnName("remaining_weight_g");
b.Property<Guid?>("SpoolId")
.HasColumnType("uuid")
.HasColumnName("spool_id");
b.Property<int>("TrayIndex")
.HasColumnType("integer")
.HasColumnName("tray_index");
b.Property<DateTime>("UpdatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.HasKey("Id");
b.HasIndex("SpoolId")
.HasDatabaseName("ix_ams_slots_spool_id");
b.HasIndex("AmsUnitId", "TrayIndex")
.IsUnique()
.HasDatabaseName("ix_ams_slots_ams_unit_id_tray_index");
b.ToTable("ams_slots", (string)null);
});
modelBuilder.Entity("Extrudex.Domain.Entities.AmsUnit", b =>
{
b.Property<Guid>("Id")
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<DateTime>("CreatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.Property<Guid>("PrinterId")
.HasColumnType("uuid")
.HasColumnName("printer_id");
b.Property<int>("UnitIndex")
.HasColumnType("integer")
.HasColumnName("unit_index");
b.Property<DateTime>("UpdatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.HasKey("Id");
b.HasIndex("PrinterId", "UnitIndex")
.IsUnique()
.HasDatabaseName("ix_ams_units_printer_id_unit_index");
b.ToTable("ams_units", (string)null);
});
modelBuilder.Entity("Extrudex.Domain.Entities.MaterialBase", b =>
{
b.Property<Guid>("Id")
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<DateTime>("CreatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.Property<decimal>("DensityGperCm3")
.HasPrecision(10, 4)
.HasColumnType("numeric(10,4)")
.HasColumnName("density_g_per_cm3");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(100)
.HasColumnType("character varying(100)")
.HasColumnName("name");
b.Property<DateTime>("UpdatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.HasKey("Id");
b.HasIndex("Name")
.IsUnique()
.HasDatabaseName("ix_material_bases_name");
b.ToTable("material_bases", (string)null);
b.HasData(
new
{
Id = new Guid("10000000-0000-0000-0000-000000000001"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1096),
DensityGperCm3 = 1.24m,
Name = "PLA",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1096)
},
new
{
Id = new Guid("10000000-0000-0000-0000-000000000002"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1620),
DensityGperCm3 = 1.27m,
Name = "PETG",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1620)
},
new
{
Id = new Guid("10000000-0000-0000-0000-000000000003"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1630),
DensityGperCm3 = 1.04m,
Name = "ABS",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1630)
},
new
{
Id = new Guid("10000000-0000-0000-0000-000000000004"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1638),
DensityGperCm3 = 1.07m,
Name = "ASA",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1638)
},
new
{
Id = new Guid("10000000-0000-0000-0000-000000000005"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1645),
DensityGperCm3 = 1.21m,
Name = "TPU",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1645)
},
new
{
Id = new Guid("10000000-0000-0000-0000-000000000006"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1651),
DensityGperCm3 = 1.14m,
Name = "Nylon",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1652)
});
});
modelBuilder.Entity("Extrudex.Domain.Entities.MaterialFinish", b =>
{
b.Property<Guid>("Id")
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<DateTime>("CreatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.Property<Guid>("MaterialBaseId")
.HasColumnType("uuid")
.HasColumnName("material_base_id");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(100)
.HasColumnType("character varying(100)")
.HasColumnName("name");
b.Property<DateTime>("UpdatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.HasKey("Id");
b.HasIndex("MaterialBaseId", "Name")
.IsUnique()
.HasDatabaseName("ix_material_finishes_material_base_id_name");
b.ToTable("material_finishes", (string)null);
b.HasData(
new
{
Id = new Guid("20000000-0000-0000-0000-000000000001"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1850),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000001"),
Name = "Basic",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1850)
},
new
{
Id = new Guid("20000000-0000-0000-0000-000000000002"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2041),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000001"),
Name = "Matte",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2041)
},
new
{
Id = new Guid("20000000-0000-0000-0000-000000000003"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2049),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000001"),
Name = "Silk",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2049)
},
new
{
Id = new Guid("20000000-0000-0000-0000-000000000004"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2055),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000001"),
Name = "Glitter",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2056)
},
new
{
Id = new Guid("20000000-0000-0000-0000-000000000005"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2062),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000001"),
Name = "Marble",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2062)
},
new
{
Id = new Guid("20000000-0000-0000-0000-000000000006"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2068),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000001"),
Name = "Sparkle",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2068)
},
new
{
Id = new Guid("20000000-0000-0000-0000-000000000007"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2075),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000002"),
Name = "Basic",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2075)
},
new
{
Id = new Guid("20000000-0000-0000-0000-000000000008"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2081),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000002"),
Name = "Matte",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2081)
},
new
{
Id = new Guid("20000000-0000-0000-0000-000000000009"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2100),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000002"),
Name = "Silk",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2100)
},
new
{
Id = new Guid("20000000-0000-0000-0000-000000000010"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2107),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000003"),
Name = "Basic",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2107)
},
new
{
Id = new Guid("20000000-0000-0000-0000-000000000011"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2113),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000003"),
Name = "Matte",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2113)
},
new
{
Id = new Guid("20000000-0000-0000-0000-000000000012"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2120),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000004"),
Name = "Basic",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2120)
},
new
{
Id = new Guid("20000000-0000-0000-0000-000000000013"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2126),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000004"),
Name = "Matte",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2126)
},
new
{
Id = new Guid("20000000-0000-0000-0000-000000000014"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2132),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000005"),
Name = "Basic",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2133)
},
new
{
Id = new Guid("20000000-0000-0000-0000-000000000015"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2139),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000006"),
Name = "Basic",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2139)
});
});
modelBuilder.Entity("Extrudex.Domain.Entities.MaterialModifier", b =>
{
b.Property<Guid>("Id")
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<DateTime>("CreatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.Property<Guid>("MaterialBaseId")
.HasColumnType("uuid")
.HasColumnName("material_base_id");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(100)
.HasColumnType("character varying(100)")
.HasColumnName("name");
b.Property<DateTime>("UpdatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.HasKey("Id");
b.HasIndex("MaterialBaseId", "Name")
.IsUnique()
.HasDatabaseName("ix_material_modifiers_material_base_id_name");
b.ToTable("material_modifiers", (string)null);
b.HasData(
new
{
Id = new Guid("30000000-0000-0000-0000-000000000001"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2304),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000001"),
Name = "Carbon Fiber",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2304)
},
new
{
Id = new Guid("30000000-0000-0000-0000-000000000002"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2463),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000001"),
Name = "Glass Fiber",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2463)
},
new
{
Id = new Guid("30000000-0000-0000-0000-000000000003"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2471),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000001"),
Name = "Wood Fill",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2471)
},
new
{
Id = new Guid("30000000-0000-0000-0000-000000000004"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2477),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000001"),
Name = "Glow-in-the-Dark",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2478)
},
new
{
Id = new Guid("30000000-0000-0000-0000-000000000005"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2484),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000002"),
Name = "Carbon Fiber",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2484)
},
new
{
Id = new Guid("30000000-0000-0000-0000-000000000006"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2490),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000002"),
Name = "Glass Fiber",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2491)
},
new
{
Id = new Guid("30000000-0000-0000-0000-000000000007"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2497),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000003"),
Name = "Carbon Fiber",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2497)
},
new
{
Id = new Guid("30000000-0000-0000-0000-000000000008"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2503),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000003"),
Name = "Glass Fiber",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2503)
},
new
{
Id = new Guid("30000000-0000-0000-0000-000000000009"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2510),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000004"),
Name = "Carbon Fiber",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2510)
},
new
{
Id = new Guid("30000000-0000-0000-0000-000000000010"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2516),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000006"),
Name = "Carbon Fiber",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2516)
},
new
{
Id = new Guid("30000000-0000-0000-0000-000000000011"),
CreatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2522),
MaterialBaseId = new Guid("10000000-0000-0000-0000-000000000006"),
Name = "Glass Fiber",
UpdatedAt = new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2523)
});
});
modelBuilder.Entity("Extrudex.Domain.Entities.PrintJob", b =>
{
b.Property<Guid>("Id")
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<DateTime?>("CompletedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("completed_at");
b.Property<decimal?>("CostPerPrint")
.HasPrecision(10, 4)
.HasColumnType("numeric(10,4)")
.HasColumnName("cost_per_print");
b.Property<DateTime>("CreatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.Property<string>("DataSource")
.IsRequired()
.HasMaxLength(50)
.HasColumnType("character varying(50)")
.HasColumnName("data_source");
b.Property<decimal>("FilamentDiameterAtPrintMm")
.HasPrecision(6, 3)
.HasColumnType("numeric(6,3)")
.HasColumnName("filament_diameter_at_print_mm");
b.Property<string>("GcodeFilePath")
.HasMaxLength(1000)
.HasColumnType("character varying(1000)")
.HasColumnName("gcode_file_path");
b.Property<decimal>("GramsDerived")
.HasPrecision(10, 2)
.HasColumnType("numeric(10,2)")
.HasColumnName("grams_derived");
b.Property<decimal>("MaterialDensityAtPrint")
.HasPrecision(10, 4)
.HasColumnType("numeric(10,4)")
.HasColumnName("material_density_at_print");
b.Property<decimal>("MmExtruded")
.HasPrecision(12, 2)
.HasColumnType("numeric(12,2)")
.HasColumnName("mm_extruded");
b.Property<string>("Notes")
.HasMaxLength(2000)
.HasColumnType("character varying(2000)")
.HasColumnName("notes");
b.Property<string>("PrintName")
.IsRequired()
.HasMaxLength(500)
.HasColumnType("character varying(500)")
.HasColumnName("print_name");
b.Property<Guid>("PrinterId")
.HasColumnType("uuid")
.HasColumnName("printer_id");
b.Property<Guid>("SpoolId")
.HasColumnType("uuid")
.HasColumnName("spool_id");
b.Property<DateTime?>("StartedAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("started_at");
b.Property<string>("Status")
.IsRequired()
.ValueGeneratedOnAdd()
.HasMaxLength(50)
.HasColumnType("character varying(50)")
.HasDefaultValue("Queued")
.HasColumnName("status");
b.Property<DateTime>("UpdatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.HasKey("Id");
b.HasIndex("DataSource")
.HasDatabaseName("ix_print_jobs_data_source");
b.HasIndex("PrinterId")
.HasDatabaseName("ix_print_jobs_printer_id");
b.HasIndex("SpoolId")
.HasDatabaseName("ix_print_jobs_spool_id");
b.HasIndex("Status")
.HasDatabaseName("ix_print_jobs_status");
b.ToTable("print_jobs", (string)null);
});
modelBuilder.Entity("Extrudex.Domain.Entities.Printer", b =>
{
b.Property<Guid>("Id")
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<string>("ApiKey")
.IsRequired()
.HasMaxLength(500)
.HasColumnType("character varying(500)")
.HasColumnName("api_key");
b.Property<string>("ConnectionType")
.IsRequired()
.HasMaxLength(50)
.HasColumnType("character varying(50)")
.HasColumnName("connection_type");
b.Property<DateTime>("CreatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.Property<string>("HostnameOrIp")
.IsRequired()
.HasMaxLength(255)
.HasColumnType("character varying(255)")
.HasColumnName("hostname_or_ip");
b.Property<bool>("IsActive")
.ValueGeneratedOnAdd()
.HasColumnType("boolean")
.HasDefaultValue(true)
.HasColumnName("is_active");
b.Property<DateTime?>("LastSeenAt")
.HasColumnType("timestamp with time zone")
.HasColumnName("last_seen_at");
b.Property<string>("Manufacturer")
.IsRequired()
.HasMaxLength(200)
.HasColumnType("character varying(200)")
.HasColumnName("manufacturer");
b.Property<string>("Model")
.IsRequired()
.HasMaxLength(200)
.HasColumnType("character varying(200)")
.HasColumnName("model");
b.Property<string>("MqttPassword")
.IsRequired()
.HasMaxLength(500)
.HasColumnType("character varying(500)")
.HasColumnName("mqtt_password");
b.Property<bool>("MqttUseTls")
.ValueGeneratedOnAdd()
.HasColumnType("boolean")
.HasDefaultValue(false)
.HasColumnName("mqtt_use_tls");
b.Property<string>("MqttUsername")
.IsRequired()
.HasMaxLength(200)
.HasColumnType("character varying(200)")
.HasColumnName("mqtt_username");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(200)
.HasColumnType("character varying(200)")
.HasColumnName("name");
b.Property<int>("Port")
.HasColumnType("integer")
.HasColumnName("port");
b.Property<string>("PrinterType")
.IsRequired()
.HasMaxLength(50)
.HasColumnType("character varying(50)")
.HasColumnName("printer_type");
b.Property<string>("Status")
.IsRequired()
.ValueGeneratedOnAdd()
.HasMaxLength(50)
.HasColumnType("character varying(50)")
.HasDefaultValue("Offline")
.HasColumnName("status");
b.Property<DateTime>("UpdatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.HasKey("Id");
b.HasIndex("ConnectionType")
.HasDatabaseName("ix_printers_connection_type");
b.HasIndex("IsActive")
.HasDatabaseName("ix_printers_is_active");
b.HasIndex("PrinterType")
.HasDatabaseName("ix_printers_printer_type");
b.HasIndex("Status")
.HasDatabaseName("ix_printers_status");
b.ToTable("printers", (string)null);
});
modelBuilder.Entity("Extrudex.Domain.Entities.Spool", b =>
{
b.Property<Guid>("Id")
.HasColumnType("uuid")
.HasColumnName("id");
b.Property<string>("Brand")
.IsRequired()
.HasMaxLength(200)
.HasColumnType("character varying(200)")
.HasColumnName("brand");
b.Property<string>("ColorHex")
.IsRequired()
.HasMaxLength(7)
.HasColumnType("character varying(7)")
.HasColumnName("color_hex");
b.Property<string>("ColorName")
.IsRequired()
.HasMaxLength(200)
.HasColumnType("character varying(200)")
.HasColumnName("color_name");
b.Property<DateTime>("CreatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("created_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.Property<decimal>("FilamentDiameterMm")
.HasPrecision(6, 3)
.HasColumnType("numeric(6,3)")
.HasColumnName("filament_diameter_mm");
b.Property<bool>("IsActive")
.ValueGeneratedOnAdd()
.HasColumnType("boolean")
.HasDefaultValue(true)
.HasColumnName("is_active");
b.Property<Guid>("MaterialBaseId")
.HasColumnType("uuid")
.HasColumnName("material_base_id");
b.Property<Guid>("MaterialFinishId")
.HasColumnType("uuid")
.HasColumnName("material_finish_id");
b.Property<Guid?>("MaterialModifierId")
.HasColumnType("uuid")
.HasColumnName("material_modifier_id");
b.Property<DateTime?>("PurchaseDate")
.HasColumnType("timestamp with time zone")
.HasColumnName("purchase_date");
b.Property<decimal?>("PurchasePrice")
.HasPrecision(10, 2)
.HasColumnType("numeric(10,2)")
.HasColumnName("purchase_price");
b.Property<string>("SpoolSerial")
.IsRequired()
.HasMaxLength(200)
.HasColumnType("character varying(200)")
.HasColumnName("spool_serial");
b.Property<DateTime>("UpdatedAt")
.ValueGeneratedOnAdd()
.HasColumnType("timestamp with time zone")
.HasColumnName("updated_at")
.HasDefaultValueSql("now() at time zone 'utc'");
b.Property<decimal>("WeightRemainingGrams")
.HasPrecision(10, 2)
.HasColumnType("numeric(10,2)")
.HasColumnName("weight_remaining_grams");
b.Property<decimal>("WeightTotalGrams")
.HasPrecision(10, 2)
.HasColumnType("numeric(10,2)")
.HasColumnName("weight_total_grams");
b.HasKey("Id");
b.HasIndex("IsActive")
.HasDatabaseName("ix_spools_is_active");
b.HasIndex("MaterialBaseId")
.HasDatabaseName("ix_spools_material_base_id");
b.HasIndex("MaterialFinishId")
.HasDatabaseName("ix_spools_material_finish_id");
b.HasIndex("MaterialModifierId")
.HasDatabaseName("ix_spools_material_modifier_id");
b.HasIndex("SpoolSerial")
.IsUnique()
.HasDatabaseName("ix_spools_spool_serial");
b.ToTable("spools", (string)null);
});
modelBuilder.Entity("Extrudex.Domain.Entities.AmsSlot", b =>
{
b.HasOne("Extrudex.Domain.Entities.AmsUnit", "AmsUnit")
.WithMany("Slots")
.HasForeignKey("AmsUnitId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired()
.HasConstraintName("fk_ams_slots_ams_unit");
b.HasOne("Extrudex.Domain.Entities.Spool", "Spool")
.WithMany("AmsSlots")
.HasForeignKey("SpoolId")
.OnDelete(DeleteBehavior.SetNull)
.HasConstraintName("fk_ams_slots_spool");
b.Navigation("AmsUnit");
b.Navigation("Spool");
});
modelBuilder.Entity("Extrudex.Domain.Entities.AmsUnit", b =>
{
b.HasOne("Extrudex.Domain.Entities.Printer", "Printer")
.WithMany("AmsUnits")
.HasForeignKey("PrinterId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired()
.HasConstraintName("fk_ams_units_printer");
b.Navigation("Printer");
});
modelBuilder.Entity("Extrudex.Domain.Entities.MaterialFinish", b =>
{
b.HasOne("Extrudex.Domain.Entities.MaterialBase", "MaterialBase")
.WithMany("Finishes")
.HasForeignKey("MaterialBaseId")
.OnDelete(DeleteBehavior.Restrict)
.IsRequired()
.HasConstraintName("fk_material_finishes_material_base");
b.Navigation("MaterialBase");
});
modelBuilder.Entity("Extrudex.Domain.Entities.MaterialModifier", b =>
{
b.HasOne("Extrudex.Domain.Entities.MaterialBase", "MaterialBase")
.WithMany("Modifiers")
.HasForeignKey("MaterialBaseId")
.OnDelete(DeleteBehavior.Restrict)
.IsRequired()
.HasConstraintName("fk_material_modifiers_material_base");
b.Navigation("MaterialBase");
});
modelBuilder.Entity("Extrudex.Domain.Entities.PrintJob", b =>
{
b.HasOne("Extrudex.Domain.Entities.Printer", "Printer")
.WithMany("PrintJobs")
.HasForeignKey("PrinterId")
.OnDelete(DeleteBehavior.Restrict)
.IsRequired()
.HasConstraintName("fk_print_jobs_printer");
b.HasOne("Extrudex.Domain.Entities.Spool", "Spool")
.WithMany("PrintJobs")
.HasForeignKey("SpoolId")
.OnDelete(DeleteBehavior.Restrict)
.IsRequired()
.HasConstraintName("fk_print_jobs_spool");
b.Navigation("Printer");
b.Navigation("Spool");
});
modelBuilder.Entity("Extrudex.Domain.Entities.Spool", b =>
{
b.HasOne("Extrudex.Domain.Entities.MaterialBase", "MaterialBase")
.WithMany("Spools")
.HasForeignKey("MaterialBaseId")
.OnDelete(DeleteBehavior.Restrict)
.IsRequired()
.HasConstraintName("fk_spools_material_base");
b.HasOne("Extrudex.Domain.Entities.MaterialFinish", "MaterialFinish")
.WithMany("Spools")
.HasForeignKey("MaterialFinishId")
.OnDelete(DeleteBehavior.Restrict)
.IsRequired()
.HasConstraintName("fk_spools_material_finish");
b.HasOne("Extrudex.Domain.Entities.MaterialModifier", "MaterialModifier")
.WithMany("Spools")
.HasForeignKey("MaterialModifierId")
.OnDelete(DeleteBehavior.SetNull)
.HasConstraintName("fk_spools_material_modifier");
b.Navigation("MaterialBase");
b.Navigation("MaterialFinish");
b.Navigation("MaterialModifier");
});
modelBuilder.Entity("Extrudex.Domain.Entities.AmsUnit", b =>
{
b.Navigation("Slots");
});
modelBuilder.Entity("Extrudex.Domain.Entities.MaterialBase", b =>
{
b.Navigation("Finishes");
b.Navigation("Modifiers");
b.Navigation("Spools");
});
modelBuilder.Entity("Extrudex.Domain.Entities.MaterialFinish", b =>
{
b.Navigation("Spools");
});
modelBuilder.Entity("Extrudex.Domain.Entities.MaterialModifier", b =>
{
b.Navigation("Spools");
});
modelBuilder.Entity("Extrudex.Domain.Entities.Printer", b =>
{
b.Navigation("AmsUnits");
b.Navigation("PrintJobs");
});
modelBuilder.Entity("Extrudex.Domain.Entities.Spool", b =>
{
b.Navigation("AmsSlots");
b.Navigation("PrintJobs");
});
#pragma warning restore 612, 618
}
}
}
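
For reference, the column defaults, seed-friendly schema, and unique index declared above for `material_bases` correspond roughly to the following PostgreSQL DDL. This is a hedged sketch of what the `InitialCreate` migration below produces; EF Core's actual emitted SQL may differ in identifier quoting and statement order:

```sql
-- Approximate DDL for the material_bases table as configured in the snapshot:
-- uuid PK, unique name (max 100 chars), numeric(10,4) density,
-- and UTC "now" defaults on the audit timestamps.
CREATE TABLE material_bases (
    id uuid NOT NULL,
    name character varying(100) NOT NULL,
    density_g_per_cm3 numeric(10,4) NOT NULL,
    created_at timestamp with time zone NOT NULL DEFAULT (now() at time zone 'utc'),
    updated_at timestamp with time zone NOT NULL DEFAULT (now() at time zone 'utc'),
    CONSTRAINT "PK_material_bases" PRIMARY KEY (id)
);

CREATE UNIQUE INDEX ix_material_bases_name ON material_bases (name);
```

The same pattern (uuid keys, snake_case column names, `now() at time zone 'utc'` defaults) repeats for every other table in the model.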


@@ -0,0 +1,416 @@
using System;
using Microsoft.EntityFrameworkCore.Migrations;
#nullable disable
#pragma warning disable CA1814 // Prefer jagged arrays over multidimensional
namespace Extrudex.Infrastructure.Data.Migrations
{
/// <inheritdoc />
public partial class InitialCreate : Migration
{
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.CreateTable(
name: "material_bases",
columns: table => new
{
id = table.Column<Guid>(type: "uuid", nullable: false),
name = table.Column<string>(type: "character varying(100)", maxLength: 100, nullable: false),
density_g_per_cm3 = table.Column<decimal>(type: "numeric(10,4)", precision: 10, scale: 4, nullable: false),
created_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'"),
updated_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'")
},
constraints: table =>
{
table.PrimaryKey("PK_material_bases", x => x.id);
});
migrationBuilder.CreateTable(
name: "printers",
columns: table => new
{
id = table.Column<Guid>(type: "uuid", nullable: false),
status = table.Column<string>(type: "character varying(50)", maxLength: 50, nullable: false, defaultValue: "Offline"),
name = table.Column<string>(type: "character varying(200)", maxLength: 200, nullable: false),
manufacturer = table.Column<string>(type: "character varying(200)", maxLength: 200, nullable: false),
model = table.Column<string>(type: "character varying(200)", maxLength: 200, nullable: false),
printer_type = table.Column<string>(type: "character varying(50)", maxLength: 50, nullable: false),
connection_type = table.Column<string>(type: "character varying(50)", maxLength: 50, nullable: false),
hostname_or_ip = table.Column<string>(type: "character varying(255)", maxLength: 255, nullable: false),
port = table.Column<int>(type: "integer", nullable: false),
mqtt_username = table.Column<string>(type: "character varying(200)", maxLength: 200, nullable: false),
mqtt_password = table.Column<string>(type: "character varying(500)", maxLength: 500, nullable: false),
mqtt_use_tls = table.Column<bool>(type: "boolean", nullable: false, defaultValue: false),
api_key = table.Column<string>(type: "character varying(500)", maxLength: 500, nullable: false),
is_active = table.Column<bool>(type: "boolean", nullable: false, defaultValue: true),
last_seen_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: true),
created_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'"),
updated_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'")
},
constraints: table =>
{
table.PrimaryKey("PK_printers", x => x.id);
});
migrationBuilder.CreateTable(
name: "material_finishes",
columns: table => new
{
id = table.Column<Guid>(type: "uuid", nullable: false),
name = table.Column<string>(type: "character varying(100)", maxLength: 100, nullable: false),
material_base_id = table.Column<Guid>(type: "uuid", nullable: false),
created_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'"),
updated_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'")
},
constraints: table =>
{
table.PrimaryKey("PK_material_finishes", x => x.id);
table.ForeignKey(
name: "fk_material_finishes_material_base",
column: x => x.material_base_id,
principalTable: "material_bases",
principalColumn: "id",
onDelete: ReferentialAction.Restrict);
});
migrationBuilder.CreateTable(
name: "material_modifiers",
columns: table => new
{
id = table.Column<Guid>(type: "uuid", nullable: false),
name = table.Column<string>(type: "character varying(100)", maxLength: 100, nullable: false),
material_base_id = table.Column<Guid>(type: "uuid", nullable: false),
created_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'"),
updated_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'")
},
constraints: table =>
{
table.PrimaryKey("PK_material_modifiers", x => x.id);
table.ForeignKey(
name: "fk_material_modifiers_material_base",
column: x => x.material_base_id,
principalTable: "material_bases",
principalColumn: "id",
onDelete: ReferentialAction.Restrict);
});
migrationBuilder.CreateTable(
name: "ams_units",
columns: table => new
{
id = table.Column<Guid>(type: "uuid", nullable: false),
unit_index = table.Column<int>(type: "integer", nullable: false),
printer_id = table.Column<Guid>(type: "uuid", nullable: false),
created_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'"),
updated_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'")
},
constraints: table =>
{
table.PrimaryKey("PK_ams_units", x => x.id);
table.ForeignKey(
name: "fk_ams_units_printer",
column: x => x.printer_id,
principalTable: "printers",
principalColumn: "id",
onDelete: ReferentialAction.Cascade);
});
migrationBuilder.CreateTable(
name: "spools",
columns: table => new
{
id = table.Column<Guid>(type: "uuid", nullable: false),
material_base_id = table.Column<Guid>(type: "uuid", nullable: false),
material_finish_id = table.Column<Guid>(type: "uuid", nullable: false),
material_modifier_id = table.Column<Guid>(type: "uuid", nullable: true),
brand = table.Column<string>(type: "character varying(200)", maxLength: 200, nullable: false),
color_name = table.Column<string>(type: "character varying(200)", maxLength: 200, nullable: false),
color_hex = table.Column<string>(type: "character varying(7)", maxLength: 7, nullable: false),
weight_total_grams = table.Column<decimal>(type: "numeric(10,2)", precision: 10, scale: 2, nullable: false),
weight_remaining_grams = table.Column<decimal>(type: "numeric(10,2)", precision: 10, scale: 2, nullable: false),
filament_diameter_mm = table.Column<decimal>(type: "numeric(6,3)", precision: 6, scale: 3, nullable: false),
spool_serial = table.Column<string>(type: "character varying(200)", maxLength: 200, nullable: false),
purchase_price = table.Column<decimal>(type: "numeric(10,2)", precision: 10, scale: 2, nullable: true),
purchase_date = table.Column<DateTime>(type: "timestamp with time zone", nullable: true),
is_active = table.Column<bool>(type: "boolean", nullable: false, defaultValue: true),
created_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'"),
updated_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'")
},
constraints: table =>
{
table.PrimaryKey("PK_spools", x => x.id);
table.ForeignKey(
name: "fk_spools_material_base",
column: x => x.material_base_id,
principalTable: "material_bases",
principalColumn: "id",
onDelete: ReferentialAction.Restrict);
table.ForeignKey(
name: "fk_spools_material_finish",
column: x => x.material_finish_id,
principalTable: "material_finishes",
principalColumn: "id",
onDelete: ReferentialAction.Restrict);
table.ForeignKey(
name: "fk_spools_material_modifier",
column: x => x.material_modifier_id,
principalTable: "material_modifiers",
principalColumn: "id",
onDelete: ReferentialAction.SetNull);
});
migrationBuilder.CreateTable(
name: "ams_slots",
columns: table => new
{
id = table.Column<Guid>(type: "uuid", nullable: false),
tray_index = table.Column<int>(type: "integer", nullable: false),
ams_unit_id = table.Column<Guid>(type: "uuid", nullable: false),
spool_id = table.Column<Guid>(type: "uuid", nullable: true),
remaining_weight_g = table.Column<decimal>(type: "numeric(10,2)", precision: 10, scale: 2, nullable: true),
created_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'"),
updated_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'")
},
constraints: table =>
{
table.PrimaryKey("PK_ams_slots", x => x.id);
table.ForeignKey(
name: "fk_ams_slots_ams_unit",
column: x => x.ams_unit_id,
principalTable: "ams_units",
principalColumn: "id",
onDelete: ReferentialAction.Cascade);
table.ForeignKey(
name: "fk_ams_slots_spool",
column: x => x.spool_id,
principalTable: "spools",
principalColumn: "id",
onDelete: ReferentialAction.SetNull);
});
migrationBuilder.CreateTable(
name: "print_jobs",
columns: table => new
{
id = table.Column<Guid>(type: "uuid", nullable: false),
printer_id = table.Column<Guid>(type: "uuid", nullable: false),
spool_id = table.Column<Guid>(type: "uuid", nullable: false),
print_name = table.Column<string>(type: "character varying(500)", maxLength: 500, nullable: false),
gcode_file_path = table.Column<string>(type: "character varying(1000)", maxLength: 1000, nullable: true),
mm_extruded = table.Column<decimal>(type: "numeric(12,2)", precision: 12, scale: 2, nullable: false),
grams_derived = table.Column<decimal>(type: "numeric(10,2)", precision: 10, scale: 2, nullable: false),
cost_per_print = table.Column<decimal>(type: "numeric(10,4)", precision: 10, scale: 4, nullable: true),
started_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: true),
completed_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: true),
status = table.Column<string>(type: "character varying(50)", maxLength: 50, nullable: false, defaultValue: "Queued"),
data_source = table.Column<string>(type: "character varying(50)", maxLength: 50, nullable: false),
filament_diameter_at_print_mm = table.Column<decimal>(type: "numeric(6,3)", precision: 6, scale: 3, nullable: false),
material_density_at_print = table.Column<decimal>(type: "numeric(10,4)", precision: 10, scale: 4, nullable: false),
notes = table.Column<string>(type: "character varying(2000)", maxLength: 2000, nullable: true),
created_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'"),
updated_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'")
},
constraints: table =>
{
table.PrimaryKey("PK_print_jobs", x => x.id);
table.ForeignKey(
name: "fk_print_jobs_printer",
column: x => x.printer_id,
principalTable: "printers",
principalColumn: "id",
onDelete: ReferentialAction.Restrict);
table.ForeignKey(
name: "fk_print_jobs_spool",
column: x => x.spool_id,
principalTable: "spools",
principalColumn: "id",
onDelete: ReferentialAction.Restrict);
});
migrationBuilder.InsertData(
table: "material_bases",
columns: new[] { "id", "created_at", "density_g_per_cm3", "name", "updated_at" },
values: new object[,]
{
{ new Guid("10000000-0000-0000-0000-000000000001"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1096), 1.24m, "PLA", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1096) },
{ new Guid("10000000-0000-0000-0000-000000000002"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1620), 1.27m, "PETG", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1620) },
{ new Guid("10000000-0000-0000-0000-000000000003"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1630), 1.04m, "ABS", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1630) },
{ new Guid("10000000-0000-0000-0000-000000000004"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1638), 1.07m, "ASA", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1638) },
{ new Guid("10000000-0000-0000-0000-000000000005"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1645), 1.21m, "TPU", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1645) },
{ new Guid("10000000-0000-0000-0000-000000000006"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1651), 1.14m, "Nylon", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1652) }
});
migrationBuilder.InsertData(
table: "material_finishes",
columns: new[] { "id", "created_at", "material_base_id", "name", "updated_at" },
values: new object[,]
{
{ new Guid("20000000-0000-0000-0000-000000000001"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1850), new Guid("10000000-0000-0000-0000-000000000001"), "Basic", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1850) },
{ new Guid("20000000-0000-0000-0000-000000000002"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2041), new Guid("10000000-0000-0000-0000-000000000001"), "Matte", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2041) },
{ new Guid("20000000-0000-0000-0000-000000000003"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2049), new Guid("10000000-0000-0000-0000-000000000001"), "Silk", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2049) },
{ new Guid("20000000-0000-0000-0000-000000000004"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2055), new Guid("10000000-0000-0000-0000-000000000001"), "Glitter", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2056) },
{ new Guid("20000000-0000-0000-0000-000000000005"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2062), new Guid("10000000-0000-0000-0000-000000000001"), "Marble", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2062) },
{ new Guid("20000000-0000-0000-0000-000000000006"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2068), new Guid("10000000-0000-0000-0000-000000000001"), "Sparkle", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2068) },
{ new Guid("20000000-0000-0000-0000-000000000007"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2075), new Guid("10000000-0000-0000-0000-000000000002"), "Basic", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2075) },
{ new Guid("20000000-0000-0000-0000-000000000008"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2081), new Guid("10000000-0000-0000-0000-000000000002"), "Matte", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2081) },
{ new Guid("20000000-0000-0000-0000-000000000009"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2100), new Guid("10000000-0000-0000-0000-000000000002"), "Silk", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2100) },
{ new Guid("20000000-0000-0000-0000-000000000010"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2107), new Guid("10000000-0000-0000-0000-000000000003"), "Basic", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2107) },
{ new Guid("20000000-0000-0000-0000-000000000011"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2113), new Guid("10000000-0000-0000-0000-000000000003"), "Matte", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2113) },
{ new Guid("20000000-0000-0000-0000-000000000012"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2120), new Guid("10000000-0000-0000-0000-000000000004"), "Basic", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2120) },
{ new Guid("20000000-0000-0000-0000-000000000013"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2126), new Guid("10000000-0000-0000-0000-000000000004"), "Matte", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2126) },
{ new Guid("20000000-0000-0000-0000-000000000014"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2132), new Guid("10000000-0000-0000-0000-000000000005"), "Basic", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2133) },
{ new Guid("20000000-0000-0000-0000-000000000015"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2139), new Guid("10000000-0000-0000-0000-000000000006"), "Basic", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2139) }
});
migrationBuilder.InsertData(
table: "material_modifiers",
columns: new[] { "id", "created_at", "material_base_id", "name", "updated_at" },
values: new object[,]
{
{ new Guid("30000000-0000-0000-0000-000000000001"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2304), new Guid("10000000-0000-0000-0000-000000000001"), "Carbon Fiber", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2304) },
{ new Guid("30000000-0000-0000-0000-000000000002"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2463), new Guid("10000000-0000-0000-0000-000000000001"), "Glass Fiber", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2463) },
{ new Guid("30000000-0000-0000-0000-000000000003"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2471), new Guid("10000000-0000-0000-0000-000000000001"), "Wood Fill", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2471) },
{ new Guid("30000000-0000-0000-0000-000000000004"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2477), new Guid("10000000-0000-0000-0000-000000000001"), "Glow-in-the-Dark", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2478) },
{ new Guid("30000000-0000-0000-0000-000000000005"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2484), new Guid("10000000-0000-0000-0000-000000000002"), "Carbon Fiber", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2484) },
{ new Guid("30000000-0000-0000-0000-000000000006"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2490), new Guid("10000000-0000-0000-0000-000000000002"), "Glass Fiber", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2491) },
{ new Guid("30000000-0000-0000-0000-000000000007"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2497), new Guid("10000000-0000-0000-0000-000000000003"), "Carbon Fiber", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2497) },
{ new Guid("30000000-0000-0000-0000-000000000008"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2503), new Guid("10000000-0000-0000-0000-000000000003"), "Glass Fiber", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2503) },
{ new Guid("30000000-0000-0000-0000-000000000009"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2510), new Guid("10000000-0000-0000-0000-000000000004"), "Carbon Fiber", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2510) },
{ new Guid("30000000-0000-0000-0000-000000000010"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2516), new Guid("10000000-0000-0000-0000-000000000006"), "Carbon Fiber", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2516) },
{ new Guid("30000000-0000-0000-0000-000000000011"), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2522), new Guid("10000000-0000-0000-0000-000000000006"), "Glass Fiber", new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2523) }
});
migrationBuilder.CreateIndex(
name: "ix_ams_slots_ams_unit_id_tray_index",
table: "ams_slots",
columns: new[] { "ams_unit_id", "tray_index" },
unique: true);
migrationBuilder.CreateIndex(
name: "ix_ams_slots_spool_id",
table: "ams_slots",
column: "spool_id");
migrationBuilder.CreateIndex(
name: "ix_ams_units_printer_id_unit_index",
table: "ams_units",
columns: new[] { "printer_id", "unit_index" },
unique: true);
migrationBuilder.CreateIndex(
name: "ix_material_bases_name",
table: "material_bases",
column: "name",
unique: true);
migrationBuilder.CreateIndex(
name: "ix_material_finishes_material_base_id_name",
table: "material_finishes",
columns: new[] { "material_base_id", "name" },
unique: true);
migrationBuilder.CreateIndex(
name: "ix_material_modifiers_material_base_id_name",
table: "material_modifiers",
columns: new[] { "material_base_id", "name" },
unique: true);
migrationBuilder.CreateIndex(
name: "ix_print_jobs_data_source",
table: "print_jobs",
column: "data_source");
migrationBuilder.CreateIndex(
name: "ix_print_jobs_printer_id",
table: "print_jobs",
column: "printer_id");
migrationBuilder.CreateIndex(
name: "ix_print_jobs_spool_id",
table: "print_jobs",
column: "spool_id");
migrationBuilder.CreateIndex(
name: "ix_print_jobs_status",
table: "print_jobs",
column: "status");
migrationBuilder.CreateIndex(
name: "ix_printers_connection_type",
table: "printers",
column: "connection_type");
migrationBuilder.CreateIndex(
name: "ix_printers_is_active",
table: "printers",
column: "is_active");
migrationBuilder.CreateIndex(
name: "ix_printers_printer_type",
table: "printers",
column: "printer_type");
migrationBuilder.CreateIndex(
name: "ix_printers_status",
table: "printers",
column: "status");
migrationBuilder.CreateIndex(
name: "ix_spools_is_active",
table: "spools",
column: "is_active");
migrationBuilder.CreateIndex(
name: "ix_spools_material_base_id",
table: "spools",
column: "material_base_id");
migrationBuilder.CreateIndex(
name: "ix_spools_material_finish_id",
table: "spools",
column: "material_finish_id");
migrationBuilder.CreateIndex(
name: "ix_spools_material_modifier_id",
table: "spools",
column: "material_modifier_id");
migrationBuilder.CreateIndex(
name: "ix_spools_spool_serial",
table: "spools",
column: "spool_serial",
unique: true);
}
/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropTable(
name: "ams_slots");
migrationBuilder.DropTable(
name: "print_jobs");
migrationBuilder.DropTable(
name: "ams_units");
migrationBuilder.DropTable(
name: "spools");
migrationBuilder.DropTable(
name: "printers");
migrationBuilder.DropTable(
name: "material_finishes");
migrationBuilder.DropTable(
name: "material_modifiers");
migrationBuilder.DropTable(
name: "material_bases");
}
}
}
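
The `print_jobs` table above stores `mm_extruded` together with `filament_diameter_at_print_mm` and `material_density_at_print`, snapshotting the values in effect at print time so `grams_derived` stays reproducible even if the spool's material data changes later. A minimal sketch of that length-to-mass conversion (standard volume-times-density math; the function name and the default diameter/density values are illustrative assumptions, not taken from this codebase):

```python
import math

def grams_from_mm_extruded(mm_extruded: float,
                           diameter_mm: float = 1.75,
                           density_g_per_cm3: float = 1.24) -> float:
    """Convert linear filament usage (mm) to mass (g).

    volume = cross-section area * length, in mm^3; 1 cm^3 = 1000 mm^3.
    """
    radius_mm = diameter_mm / 2.0
    volume_mm3 = math.pi * radius_mm ** 2 * mm_extruded
    return (volume_mm3 / 1000.0) * density_g_per_cm3

# e.g. 10 m of 1.75 mm PLA at 1.24 g/cm^3 is roughly 29.8 g
print(round(grams_from_mm_extruded(10_000), 2))
```

The numeric precisions in the schema line up with this: `mm_extruded` as `numeric(12,2)`, the derived grams as `numeric(10,2)`, and the density snapshot as `numeric(10,4)`.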

File diff suppressed because it is too large

@@ -0,0 +1,533 @@
using System;
using Microsoft.EntityFrameworkCore.Migrations;
#nullable disable
namespace Extrudex.Infrastructure.Data.Migrations
{
/// <inheritdoc />
public partial class AddFilamentUsageTrackingModel : Migration
{
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.CreateTable(
name: "filament_usages",
columns: table => new
{
id = table.Column<Guid>(type: "uuid", nullable: false),
print_job_id = table.Column<Guid>(type: "uuid", nullable: false),
spool_id = table.Column<Guid>(type: "uuid", nullable: false),
printer_id = table.Column<Guid>(type: "uuid", nullable: false),
grams_used = table.Column<decimal>(type: "numeric(10,2)", precision: 10, scale: 2, nullable: false),
mm_extruded = table.Column<decimal>(type: "numeric(12,2)", precision: 12, scale: 2, nullable: false),
recorded_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'"),
notes = table.Column<string>(type: "character varying(2000)", maxLength: 2000, nullable: true),
created_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'"),
updated_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'")
},
constraints: table =>
{
table.PrimaryKey("PK_filament_usages", x => x.id);
table.ForeignKey(
name: "fk_filament_usages_print_job",
column: x => x.print_job_id,
principalTable: "print_jobs",
principalColumn: "id",
onDelete: ReferentialAction.Cascade);
table.ForeignKey(
name: "fk_filament_usages_printer",
column: x => x.printer_id,
principalTable: "printers",
principalColumn: "id",
onDelete: ReferentialAction.Restrict);
table.ForeignKey(
name: "fk_filament_usages_spool",
column: x => x.spool_id,
principalTable: "spools",
principalColumn: "id",
onDelete: ReferentialAction.Restrict);
});
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000001"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 291, DateTimeKind.Utc).AddTicks(9388), new DateTime(2026, 4, 26, 18, 34, 33, 291, DateTimeKind.Utc).AddTicks(9388) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000002"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 291, DateTimeKind.Utc).AddTicks(9871), new DateTime(2026, 4, 26, 18, 34, 33, 291, DateTimeKind.Utc).AddTicks(9871) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000003"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 291, DateTimeKind.Utc).AddTicks(9881), new DateTime(2026, 4, 26, 18, 34, 33, 291, DateTimeKind.Utc).AddTicks(9881) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000004"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 291, DateTimeKind.Utc).AddTicks(9888), new DateTime(2026, 4, 26, 18, 34, 33, 291, DateTimeKind.Utc).AddTicks(9888) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000005"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 291, DateTimeKind.Utc).AddTicks(9895), new DateTime(2026, 4, 26, 18, 34, 33, 291, DateTimeKind.Utc).AddTicks(9895) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000006"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 291, DateTimeKind.Utc).AddTicks(9901), new DateTime(2026, 4, 26, 18, 34, 33, 291, DateTimeKind.Utc).AddTicks(9902) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000001"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(90), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(90) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000002"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(251), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(251) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000003"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(259), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(259) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000004"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(266), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(266) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000005"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(272), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(272) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000006"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(278), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(278) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000007"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(285), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(285) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000008"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(291), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(291) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000009"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(297), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(298) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000010"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(304), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(304) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000011"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(310), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(310) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000012"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(316), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(317) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000013"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(323), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(323) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000014"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(329), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(329) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000015"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(336), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(336) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000001"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(482), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(482) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000002"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(805), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(806) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000003"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(815), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(815) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000004"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(821), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(821) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000005"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(828), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(828) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000006"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(834), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(834) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000007"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(840), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(840) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000008"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(847), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(847) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000009"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(853), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(853) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000010"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(859), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(860) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000011"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(866), new DateTime(2026, 4, 26, 18, 34, 33, 292, DateTimeKind.Utc).AddTicks(866) });
migrationBuilder.CreateIndex(
name: "ix_filament_usages_print_job_id",
table: "filament_usages",
column: "print_job_id");
migrationBuilder.CreateIndex(
name: "ix_filament_usages_printer_id",
table: "filament_usages",
column: "printer_id");
migrationBuilder.CreateIndex(
name: "ix_filament_usages_recorded_at",
table: "filament_usages",
column: "recorded_at");
migrationBuilder.CreateIndex(
name: "ix_filament_usages_spool_id",
table: "filament_usages",
column: "spool_id");
migrationBuilder.CreateIndex(
name: "ix_filament_usages_spool_id_recorded_at",
table: "filament_usages",
columns: new[] { "spool_id", "recorded_at" });
}
/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropTable(
name: "filament_usages");
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000001"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1096), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1096) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000002"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1620), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1620) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000003"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1630), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1630) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000004"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1638), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1638) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000005"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1645), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1645) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000006"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1651), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1652) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000001"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1850), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1850) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000002"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2041), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2041) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000003"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2049), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2049) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000004"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2055), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2056) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000005"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2062), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2062) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000006"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2068), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2068) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000007"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2075), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2075) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000008"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2081), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2081) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000009"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2100), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2100) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000010"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2107), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2107) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000011"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2113), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2113) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000012"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2120), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2120) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000013"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2126), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2126) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000014"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2132), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2133) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000015"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2139), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2139) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000001"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2304), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2304) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000002"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2463), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2463) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000003"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2471), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2471) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000004"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2477), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2478) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000005"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2484), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2484) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000006"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2490), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2491) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000007"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2497), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2497) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000008"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2503), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2503) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000009"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2510), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2510) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000010"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2516), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2516) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000011"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2522), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2523) });
}
}
}
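Most of the bulk in these migrations is not schema change at all: it is EF Core re-capturing the `created_at`/`updated_at` seed values for `material_bases`, `material_finishes`, and `material_modifiers` every time a migration is scaffolded, which happens when `HasData` seeds use a dynamic timestamp. A minimal sketch of the usual fix, assuming an entity named `MaterialBase` mapped to `material_bases` (the real entity and property names in this repo may differ):

```csharp
// Sketch only: pin seed timestamps to a fixed constant instead of a dynamic
// value such as DateTime.UtcNow, so that re-scaffolding a migration does not
// emit a fresh block of UpdateData calls for every seeded row.
var seededAt = new DateTime(2026, 4, 26, 0, 0, 0, DateTimeKind.Utc);

modelBuilder.Entity<MaterialBase>().HasData(new MaterialBase
{
    Id = new Guid("10000000-0000-0000-0000-000000000001"),
    CreatedAt = seededAt, // constant, identical across scaffolds
    UpdatedAt = seededAt
});
```

With a dynamic seed value, every `dotnet ef migrations add` snapshots a new timestamp and diffs it against the model snapshot, producing exactly the `UpdateData` churn visible above and below.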


@@ -0,0 +1,534 @@
using System;
using Microsoft.EntityFrameworkCore.Migrations;
#nullable disable
namespace Extrudex.Infrastructure.Data.Migrations
{
/// <inheritdoc />
public partial class AddUsageLogTable : Migration
{
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.CreateTable(
name: "usage_logs",
columns: table => new
{
id = table.Column<Guid>(type: "uuid", nullable: false),
spool_id = table.Column<Guid>(type: "uuid", nullable: false),
printer_id = table.Column<Guid>(type: "uuid", nullable: true),
print_job_id = table.Column<Guid>(type: "uuid", nullable: true),
grams_used = table.Column<decimal>(type: "numeric(10,2)", precision: 10, scale: 2, nullable: false),
mm_extruded = table.Column<decimal>(type: "numeric(12,2)", precision: 12, scale: 2, nullable: true),
usage_timestamp = table.Column<DateTime>(type: "timestamp with time zone", nullable: false),
data_source = table.Column<string>(type: "character varying(50)", maxLength: 50, nullable: false),
notes = table.Column<string>(type: "character varying(2000)", maxLength: 2000, nullable: true),
created_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'"),
updated_at = table.Column<DateTime>(type: "timestamp with time zone", nullable: false, defaultValueSql: "now() at time zone 'utc'")
},
constraints: table =>
{
table.PrimaryKey("PK_usage_logs", x => x.id);
// Deleting a print job or printer detaches its usage rows but keeps the history.
table.ForeignKey(
name: "fk_usage_logs_print_job",
column: x => x.print_job_id,
principalTable: "print_jobs",
principalColumn: "id",
onDelete: ReferentialAction.SetNull);
table.ForeignKey(
name: "fk_usage_logs_printer",
column: x => x.printer_id,
principalTable: "printers",
principalColumn: "id",
onDelete: ReferentialAction.SetNull);
// A spool cannot be deleted while usage rows still reference it.
table.ForeignKey(
name: "fk_usage_logs_spool",
column: x => x.spool_id,
principalTable: "spools",
principalColumn: "id",
onDelete: ReferentialAction.Restrict);
});
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000001"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(6535), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(6535) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000002"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7016), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7016) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000003"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7027), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7028) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000004"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7034), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7035) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000005"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7042), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7042) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000006"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7049), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7049) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000001"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7291), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7292) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000002"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7453), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7453) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000003"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7461), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7461) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000004"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7468), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7468) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000005"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7474), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7474) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000006"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7480), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7481) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000007"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7487), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7487) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000008"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7493), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7493) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000009"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7500), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7500) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000010"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7507), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7507) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000011"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7513), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7513) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000012"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7519), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7520) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000013"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7526), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7526) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000014"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7532), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7532) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000015"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7538), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7539) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000001"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7690), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7690) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000002"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7838), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7838) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000003"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7846), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7846) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000004"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7853), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7853) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000005"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7859), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7859) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000006"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7865), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7866) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000007"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7872), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7872) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000008"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7878), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7879) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000009"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7885), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7885) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000010"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7891), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7891) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000011"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7898), new DateTime(2026, 4, 26, 18, 43, 28, 895, DateTimeKind.Utc).AddTicks(7898) });
migrationBuilder.CreateIndex(
name: "ix_usage_logs_data_source",
table: "usage_logs",
column: "data_source");
migrationBuilder.CreateIndex(
name: "ix_usage_logs_print_job_id",
table: "usage_logs",
column: "print_job_id");
migrationBuilder.CreateIndex(
name: "ix_usage_logs_printer_id",
table: "usage_logs",
column: "printer_id");
migrationBuilder.CreateIndex(
name: "ix_usage_logs_spool_id",
table: "usage_logs",
column: "spool_id");
migrationBuilder.CreateIndex(
name: "ix_usage_logs_usage_timestamp",
table: "usage_logs",
column: "usage_timestamp");
}
/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropTable(
name: "usage_logs");
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000001"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1096), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1096) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000002"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1620), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1620) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000003"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1630), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1630) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000004"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1638), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1638) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000005"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1645), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1645) });
migrationBuilder.UpdateData(
table: "material_bases",
keyColumn: "id",
keyValue: new Guid("10000000-0000-0000-0000-000000000006"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1651), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1652) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000001"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1850), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(1850) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000002"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2041), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2041) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000003"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2049), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2049) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000004"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2055), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2056) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000005"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2062), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2062) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000006"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2068), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2068) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000007"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2075), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2075) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000008"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2081), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2081) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000009"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2100), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2100) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000010"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2107), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2107) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000011"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2113), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2113) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000012"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2120), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2120) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000013"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2126), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2126) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000014"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2132), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2133) });
migrationBuilder.UpdateData(
table: "material_finishes",
keyColumn: "id",
keyValue: new Guid("20000000-0000-0000-0000-000000000015"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2139), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2139) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000001"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2304), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2304) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000002"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2463), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2463) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000003"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2471), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2471) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000004"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2477), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2478) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000005"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2484), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2484) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000006"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2490), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2491) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000007"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2497), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2497) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000008"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2503), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2503) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000009"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2510), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2510) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000010"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2516), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2516) });
migrationBuilder.UpdateData(
table: "material_modifiers",
keyColumn: "id",
keyValue: new Guid("30000000-0000-0000-0000-000000000011"),
columns: new[] { "created_at", "updated_at" },
values: new object[] { new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2522), new DateTime(2026, 4, 26, 13, 14, 18, 745, DateTimeKind.Utc).AddTicks(2523) });
}
}
}

File diff suppressed because it is too large

View File

@@ -0,0 +1,158 @@
using Extrudex.Domain.Interfaces;
using Extrudex.Infrastructure.Data;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;
namespace Extrudex.Infrastructure.Services;
/// <summary>
/// Calculates the cost of goods sold (COGS) per print job using the spool's
/// purchase price and the print job's derived grams consumed.
///
/// Formula:
/// cost_per_gram = purchase_price / weight_total_grams
/// cost_per_print = grams_derived × cost_per_gram
///
/// Handles missing data gracefully — if the spool has no purchase price or
/// weight recorded, the result includes warnings and null cost fields
/// instead of throwing exceptions.
/// </summary>
public class CostPerPrintService : ICostPerPrintService
{
private readonly ExtrudexDbContext _dbContext;
private readonly ILogger<CostPerPrintService> _logger;
/// <summary>
/// Initializes a new instance of the <see cref="CostPerPrintService"/> class.
/// </summary>
/// <param name="dbContext">The database context for data access.</param>
/// <param name="logger">The logger for diagnostic output.</param>
public CostPerPrintService(ExtrudexDbContext dbContext, ILogger<CostPerPrintService> logger)
{
_dbContext = dbContext;
_logger = logger;
}
/// <inheritdoc />
public async Task<CostPerPrintResult> CalculateAsync(Guid printJobId, CancellationToken cancellationToken = default)
{
_logger.LogDebug("Calculating cost per print for job {PrintJobId}", printJobId);
var job = await _dbContext.PrintJobs
.Include(j => j.Spool)
.ThenInclude(s => s!.MaterialBase)
.FirstOrDefaultAsync(j => j.Id == printJobId, cancellationToken);
if (job is null)
{
_logger.LogWarning("Print job {PrintJobId} not found for cost calculation", printJobId);
return new CostPerPrintResult
{
PrintJobId = printJobId,
Warnings = new List<string> { $"Print job with ID '{printJobId}' not found." }
};
}
return BuildResult(job);
}
/// <inheritdoc />
public async Task<IReadOnlyList<CostPerPrintResult>> CalculateBySpoolAsync(
Guid spoolId, CancellationToken cancellationToken = default)
{
_logger.LogDebug("Calculating cost per print for all jobs on spool {SpoolId}", spoolId);
var jobs = await _dbContext.PrintJobs
.Include(j => j.Spool)
.ThenInclude(s => s!.MaterialBase)
.Where(j => j.SpoolId == spoolId)
.OrderByDescending(j => j.CreatedAt)
.ToListAsync(cancellationToken);
if (jobs.Count == 0)
{
_logger.LogDebug("No print jobs found for spool {SpoolId}", spoolId);
return Array.Empty<CostPerPrintResult>();
}
return jobs.Select(BuildResult).ToList();
}
/// <summary>
/// Builds a <see cref="CostPerPrintResult"/> from a print job entity.
/// Computes cost_per_gram and cost_per_print when all required data is available.
/// Populates warnings when data is missing or incomplete.
/// </summary>
/// <param name="job">The print job entity with Spool navigation loaded.</param>
/// <returns>A cost calculation result with breakdown and any warnings.</returns>
private CostPerPrintResult BuildResult(Domain.Entities.PrintJob job)
{
var warnings = new List<string>();
var spool = job.Spool;
// Map what we always have
var result = new CostPerPrintResult
{
PrintJobId = job.Id,
PrintName = job.PrintName,
SpoolId = job.SpoolId,
SpoolSerial = spool?.SpoolSerial ?? string.Empty,
MmExtruded = job.MmExtruded,
GramsDerived = job.GramsDerived,
};
// Guard: spool must be loaded
if (spool is null)
{
warnings.Add("Spool data is not available for this print job.");
result.Warnings = warnings;
return result;
}
// Capture purchase price
result.PurchasePrice = spool.PurchasePrice;
result.WeightTotalGrams = spool.WeightTotalGrams;
// Check for missing purchase price
if (!spool.PurchasePrice.HasValue)
{
warnings.Add(
"Spool purchase price is not recorded. Cost calculation requires a purchase price on the spool.");
}
// Check for zero or negative weight — prevents division by zero
if (spool.WeightTotalGrams <= 0)
{
warnings.Add(
"Spool total weight is zero or not recorded. Cost calculation requires a positive weight_total_grams on the spool.");
}
// Check for zero grams derived
if (job.GramsDerived <= 0)
{
warnings.Add(
"Derived grams consumed is zero. Ensure mm_extruded, filament diameter, and material density are recorded for this print job.");
}
// If all data is present and valid, compute the cost
if (spool.PurchasePrice.HasValue && spool.WeightTotalGrams > 0 && job.GramsDerived > 0)
{
var costPerGram = spool.PurchasePrice.Value / spool.WeightTotalGrams;
result.CostPerGram = Math.Round(costPerGram, 6);
result.CostPerPrint = Math.Round(job.GramsDerived * costPerGram, 4);
_logger.LogDebug(
"Cost calculated for job {PrintJobId}: {GramsDerived}g × {CostPerGram:C}/g = {CostPerPrint:C}",
job.Id, job.GramsDerived, result.CostPerGram, result.CostPerPrint);
}
else
{
_logger.LogDebug(
"Cost calculation incomplete for job {PrintJobId}: missing data (warnings: {WarningCount})",
job.Id, warnings.Count);
}
result.Warnings = warnings;
return result;
}
}
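The COGS formula documented above (`cost_per_gram = purchase_price / weight_total_grams`, `cost_per_print = grams_derived × cost_per_gram`) can be sketched as a standalone helper, using the same rounding as `BuildResult` (6 places for cost per gram, 4 for cost per print). The `CostMath` class name is illustrative only, not part of the codebase:

```csharp
using System;

// Illustrative sketch of the COGS math used by CostPerPrintService.BuildResult.
public static class CostMath
{
    // cost_per_gram = purchase_price / weight_total_grams, rounded to 6 places
    public static decimal CostPerGram(decimal purchasePrice, decimal weightTotalGrams) =>
        Math.Round(purchasePrice / weightTotalGrams, 6);

    // cost_per_print = grams_derived × cost_per_gram, rounded to 4 places
    public static decimal CostPerPrint(decimal gramsDerived, decimal purchasePrice, decimal weightTotalGrams) =>
        Math.Round(gramsDerived * (purchasePrice / weightTotalGrams), 4);
}
```

For example, a $25.00 spool weighing 1000 g costs $0.025/g, so a print consuming 52.3 g costs $1.3075.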

View File

@@ -0,0 +1,79 @@
using Extrudex.Domain.Entities;
using Extrudex.Domain.Interfaces;
using Extrudex.Infrastructure.Data;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;
namespace Extrudex.Infrastructure.Services;
/// <summary>
/// EF Core-backed implementation of the filament usage service.
/// Persists usage records to the database and provides query methods
/// for retrieving usage by print job or spool.
/// </summary>
public class FilamentUsageService : IFilamentUsageService
{
private readonly ExtrudexDbContext _dbContext;
private readonly ILogger<FilamentUsageService> _logger;
public FilamentUsageService(
ExtrudexDbContext dbContext,
ILogger<FilamentUsageService> logger)
{
_dbContext = dbContext;
_logger = logger;
}
/// <inheritdoc />
public async Task<FilamentUsage> RecordUsageAsync(
Guid printJobId,
Guid spoolId,
Guid printerId,
decimal gramsUsed,
decimal mmExtruded,
string? notes = null,
CancellationToken cancellationToken = default)
{
var usage = new FilamentUsage
{
PrintJobId = printJobId,
SpoolId = spoolId,
PrinterId = printerId,
GramsUsed = gramsUsed,
MmExtruded = mmExtruded,
RecordedAt = DateTime.UtcNow,
Notes = notes
};
_dbContext.FilamentUsages.Add(usage);
await _dbContext.SaveChangesAsync(cancellationToken);
_logger.LogInformation(
"Recorded filament usage: {Grams}g / {Mm}mm for print job {JobId} on spool {SpoolId}",
gramsUsed, mmExtruded, printJobId, spoolId);
return usage;
}
/// <inheritdoc />
public async Task<IReadOnlyList<FilamentUsage>> GetByPrintJobAsync(
Guid printJobId,
CancellationToken cancellationToken = default)
{
return await _dbContext.FilamentUsages
.Where(u => u.PrintJobId == printJobId)
.OrderByDescending(u => u.RecordedAt)
.ToListAsync(cancellationToken);
}
/// <inheritdoc />
public async Task<IReadOnlyList<FilamentUsage>> GetBySpoolAsync(
Guid spoolId,
CancellationToken cancellationToken = default)
{
return await _dbContext.FilamentUsages
.Where(u => u.SpoolId == spoolId)
.OrderByDescending(u => u.RecordedAt)
.ToListAsync(cancellationToken);
}
}

View File

@@ -0,0 +1,139 @@
using Extrudex.Domain.Enums;
using Extrudex.Domain.Interfaces;
using Extrudex.Infrastructure.Data;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;
namespace Extrudex.Infrastructure.Configuration;
/// <summary>
/// Service that syncs filament usage data from Moonraker printers into the
/// Extrudex database. Queries all active Moonraker printers, fetches their
/// current filament usage metrics, and updates spool remaining weights and
/// print job records.
/// </summary>
public class FilamentUsageSyncService : IFilamentUsageSyncService
{
private readonly ExtrudexDbContext _dbContext;
private readonly IMoonrakerClient _moonrakerClient;
private readonly ILogger<FilamentUsageSyncService> _logger;
/// <summary>
/// Creates a new FilamentUsageSyncService.
/// </summary>
/// <param name="dbContext">The EF Core database context for persisting updates.</param>
/// <param name="moonrakerClient">The Moonraker HTTP client for fetching printer data.</param>
/// <param name="logger">Logger for diagnostic output.</param>
public FilamentUsageSyncService(
ExtrudexDbContext dbContext,
IMoonrakerClient moonrakerClient,
ILogger<FilamentUsageSyncService> logger)
{
_dbContext = dbContext;
_moonrakerClient = moonrakerClient;
_logger = logger;
}
/// <inheritdoc />
public async Task<int> SyncAllAsync(CancellationToken cancellationToken = default)
{
_logger.LogInformation("Starting filament usage sync cycle");
var printers = await _dbContext.Printers
.Where(p => p.IsActive && p.ConnectionType == ConnectionType.Moonraker)
.Include(p => p.AmsUnits)
.ThenInclude(u => u.Slots)
.ThenInclude(s => s.Spool)
.ToListAsync(cancellationToken);
if (printers.Count == 0)
{
_logger.LogInformation("No active Moonraker printers found — skipping sync");
return 0;
}
_logger.LogInformation("Found {PrinterCount} active Moonraker printer(s) to sync", printers.Count);
var syncedCount = 0;
foreach (var printer in printers)
{
try
{
var usageData = await _moonrakerClient.GetFilamentUsageAsync(
printer.HostnameOrIp,
printer.Port,
printer.ApiKey,
cancellationToken);
if (usageData.Count == 0)
{
_logger.LogWarning(
"No usage data returned from printer {PrinterName} ({Host}:{Port})",
printer.Name, printer.HostnameOrIp, printer.Port);
continue;
}
// Update spool remaining weights from AMS data
UpdateSpoolWeights(printer, usageData);
// Mark printer as seen and idle (reachable = idle, not printing)
printer.LastSeenAt = DateTime.UtcNow;
printer.Status = PrinterStatus.Idle;
syncedCount++;
_logger.LogInformation(
"Successfully synced filament usage from printer {PrinterName}",
printer.Name);
}
catch (Exception ex)
{
_logger.LogError(ex,
"Error syncing filament usage from printer {PrinterName} ({Host}:{Port})",
printer.Name, printer.HostnameOrIp, printer.Port);
}
}
await _dbContext.SaveChangesAsync(cancellationToken);
_logger.LogInformation(
"Filament usage sync cycle complete — {SyncedCount}/{TotalCount} printers synced",
syncedCount, printers.Count);
return syncedCount;
}
/// <summary>
/// Updates spool remaining weights based on usage data received from Moonraker.
/// For printers with AMS units, updates the remaining weight on each slot's spool.
/// </summary>
private void UpdateSpoolWeights(
Domain.Entities.Printer printer,
Dictionary<string, decimal> usageData)
{
// Update AMS slot remaining weights if available
foreach (var amsUnit in printer.AmsUnits)
{
foreach (var slot in amsUnit.Slots)
{
if (slot.Spool != null && slot.RemainingWeightG.HasValue)
{
// Sync the AMS-reported remaining weight to the spool
slot.Spool.WeightRemainingGrams = slot.RemainingWeightG.Value;
_logger.LogDebug(
"Updated spool {SpoolSerial} remaining weight to {Weight}g",
slot.Spool.SpoolSerial, slot.RemainingWeightG.Value);
}
}
}
// If usage data contains extruded mm, log it for observability
if (usageData.TryGetValue("mm_extruded", out var mmExtruded) && mmExtruded > 0)
{
_logger.LogInformation(
"Printer {PrinterName} reports {MmExtruded}mm filament extruded in latest job",
printer.Name, mmExtruded);
}
}
}

View File

@@ -0,0 +1,95 @@
using Extrudex.Domain.Interfaces;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;
namespace Extrudex.Infrastructure.Services;
/// <summary>
/// Detects low-stock filament spools by comparing the remaining weight percentage
/// against a configurable threshold. The threshold can be set via:
/// 1. EXTRUDEX_LOW_STOCK_THRESHOLD env var (highest priority, e.g. "25")
/// 2. FilamentAlerts:LowStockThresholdPercent in appsettings.json
/// 3. Default: 20% (a standard spool is "low" when ≤20% remains)
/// </summary>
public class LowStockDetector : ILowStockDetector
{
private readonly ILogger<LowStockDetector> _logger;
/// <summary>
/// The percentage threshold below which a spool is considered low stock.
/// For example, 20 means a spool is "low" when ≤20% of its filament remains.
/// </summary>
public decimal LowStockThresholdPercent { get; }
/// <summary>
/// Initializes a new instance of the <see cref="LowStockDetector"/> class.
/// Reads the low-stock threshold from configuration with env var override support.
/// </summary>
/// <param name="configuration">Application configuration for threshold settings.</param>
/// <param name="logger">Logger for diagnostic output.</param>
public LowStockDetector(IConfiguration configuration, ILogger<LowStockDetector> logger)
{
_logger = logger;
// Priority: env var > appsettings > default (20%)
var envThreshold = Environment.GetEnvironmentVariable("EXTRUDEX_LOW_STOCK_THRESHOLD");
var configThreshold = configuration.GetValue<decimal?>("FilamentAlerts:LowStockThresholdPercent");
if (!string.IsNullOrEmpty(envThreshold) && decimal.TryParse(envThreshold, out var parsedEnv))
{
LowStockThresholdPercent = Math.Clamp(parsedEnv, 0m, 100m);
_logger.LogInformation(
"Low-stock threshold set from env var EXTRUDEX_LOW_STOCK_THRESHOLD: {Threshold}%",
LowStockThresholdPercent);
}
else if (configThreshold.HasValue)
{
LowStockThresholdPercent = Math.Clamp(configThreshold.Value, 0m, 100m);
_logger.LogInformation(
"Low-stock threshold set from config FilamentAlerts:LowStockThresholdPercent: {Threshold}%",
LowStockThresholdPercent);
}
else
{
LowStockThresholdPercent = 20m;
_logger.LogInformation(
"Low-stock threshold using default: {Threshold}%", LowStockThresholdPercent);
}
}
/// <inheritdoc />
public bool IsLowStock(decimal weightRemainingGrams, decimal weightTotalGrams)
{
if (weightTotalGrams <= 0m)
{
_logger.LogDebug(
"Spool with total weight {Total}g cannot be evaluated for low stock — treating as not low",
weightTotalGrams);
return false;
}
var remainingPercent = GetRemainingWeightPercent(weightRemainingGrams, weightTotalGrams);
var isLow = remainingPercent <= LowStockThresholdPercent;
if (isLow)
{
_logger.LogDebug(
"Spool is LOW STOCK: {Remaining}g / {Total}g = {Percent:F1}% (threshold: {Threshold}%)",
weightRemainingGrams, weightTotalGrams, remainingPercent, LowStockThresholdPercent);
}
return isLow;
}
/// <inheritdoc />
public decimal GetRemainingWeightPercent(decimal weightRemainingGrams, decimal weightTotalGrams)
{
if (weightTotalGrams <= 0m)
return 0m;
return Math.Round(
(weightRemainingGrams / weightTotalGrams) * 100m,
1,
MidpointRounding.AwayFromZero);
}
}
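The threshold comparison above reduces to a small piece of arithmetic, shown here as a standalone sketch mirroring `GetRemainingWeightPercent` and `IsLowStock` (same one-decimal away-from-zero rounding, same zero-total guard). The `StockMath` name is illustrative, not part of the codebase:

```csharp
using System;

// Illustrative sketch of the LowStockDetector percentage math.
public static class StockMath
{
    // remaining% = remaining / total × 100, rounded to one decimal, away from zero
    public static decimal RemainingPercent(decimal remaining, decimal total) =>
        total <= 0m ? 0m : Math.Round(remaining / total * 100m, 1, MidpointRounding.AwayFromZero);

    // A spool is low when its remaining percentage is at or below the threshold.
    public static bool IsLow(decimal remaining, decimal total, decimal thresholdPercent = 20m) =>
        total > 0m && RemainingPercent(remaining, total) <= thresholdPercent;
}
```

With the 20% default, a 1000 g spool with 150 g left (15.0%) is low stock, while one with 250 g left (25.0%) is not.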

View File

@@ -0,0 +1,447 @@
using System.Net.Http.Json;
using System.Text.Json;
using Extrudex.Domain.DTOs.Moonraker;
using Extrudex.Domain.Interfaces;
using Microsoft.Extensions.Logging;
namespace Extrudex.Infrastructure.Services;
/// <summary>
/// HTTP client for communicating with Moonraker REST API endpoints
/// on Klipper-based printers (e.g., Elegoo Centauri Carbon).
/// Provides strongly-typed methods for server discovery, printer status,
/// print job history, and real-time telemetry.
/// </summary>
public class MoonrakerClient : IMoonrakerClient
{
private readonly HttpClient _httpClient;
private readonly ILogger<MoonrakerClient> _logger;
/// <summary>
/// Creates a new MoonrakerClient with the configured HTTP client and logger.
/// </summary>
/// <param name="httpClient">The HTTP client for making requests to Moonraker endpoints.</param>
/// <param name="logger">Logger for diagnostic output.</param>
public MoonrakerClient(HttpClient httpClient, ILogger<MoonrakerClient> logger)
{
_httpClient = httpClient;
_logger = logger;
}
/// <inheritdoc />
public async Task<MoonrakerServerInfo?> GetServerInfoAsync(
string hostnameOrIp,
int port,
string? apiKey,
CancellationToken cancellationToken = default)
{
var baseUrl = BuildBaseUrl(hostnameOrIp, port);
try
{
using var request = CreateRequest(HttpMethod.Get, $"{baseUrl}/server/info", apiKey);
using var response = await _httpClient.SendAsync(request, cancellationToken);
response.EnsureSuccessStatusCode();
var json = await response.Content.ReadFromJsonAsync<JsonElement>(cancellationToken: cancellationToken);
var serverInfo = new MoonrakerServerInfo();
if (json.TryGetProperty("result", out var result))
{
if (result.TryGetProperty("hostname", out var hostname))
serverInfo.Hostname = hostname.GetString() ?? string.Empty;
if (result.TryGetProperty("software_version", out var version))
serverInfo.SoftwareVersion = version.GetString() ?? string.Empty;
if (result.TryGetProperty("cpu_info", out var cpuInfo))
serverInfo.CpuInfo = cpuInfo.GetString() ?? string.Empty;
if (result.TryGetProperty("klippy_connected", out var klippyConnected))
serverInfo.KlippyConnected = klippyConnected.GetBoolean();
if (result.TryGetProperty("klippy_state", out var klippyState))
serverInfo.KlippyState = klippyState.GetString() ?? string.Empty;
if (result.TryGetProperty("api_key_required", out var apiKeyRequired))
serverInfo.ApiKeyRequired = apiKeyRequired.GetBoolean();
if (result.TryGetProperty("plugins", out var plugins))
serverInfo.Plugins = plugins.EnumerateArray()
.Select(p => p.GetString() ?? string.Empty)
.Where(s => !string.IsNullOrEmpty(s))
.ToList();
}
_logger.LogDebug(
"Retrieved server info from Moonraker at {Host}:{Port} — version {Version}, klippy {State}",
hostnameOrIp, port, serverInfo.SoftwareVersion, serverInfo.KlippyState);
return serverInfo;
}
catch (HttpRequestException ex)
{
_logger.LogWarning(ex,
"Failed to retrieve server info from Moonraker at {Host}:{Port}",
hostnameOrIp, port);
return null;
}
catch (JsonException ex)
{
_logger.LogWarning(ex,
"Failed to parse Moonraker server info response from {Host}:{Port}",
hostnameOrIp, port);
return null;
}
}
/// <inheritdoc />
public async Task<bool> IsReachableAsync(
string hostnameOrIp,
int port,
string? apiKey,
CancellationToken cancellationToken = default)
{
var serverInfo = await GetServerInfoAsync(hostnameOrIp, port, apiKey, cancellationToken);
return serverInfo is not null;
}
/// <inheritdoc />
public async Task<MoonrakerPrinterInfo?> GetPrinterInfoAsync(
string hostnameOrIp,
int port,
string? apiKey,
CancellationToken cancellationToken = default)
{
var baseUrl = BuildBaseUrl(hostnameOrIp, port);
try
{
using var request = CreateRequest(HttpMethod.Get, $"{baseUrl}/printer/info", apiKey);
using var response = await _httpClient.SendAsync(request, cancellationToken);
response.EnsureSuccessStatusCode();
var json = await response.Content.ReadFromJsonAsync<JsonElement>(cancellationToken: cancellationToken);
var printerInfo = new MoonrakerPrinterInfo();
if (json.TryGetProperty("result", out var result))
{
if (result.TryGetProperty("state", out var state))
printerInfo.State = state.GetString() ?? string.Empty;
if (result.TryGetProperty("state_message", out var stateMessage))
printerInfo.StateMessage = stateMessage.GetString() ?? string.Empty;
if (result.TryGetProperty("klippy_ready", out var klippyReady))
printerInfo.KlippyReady = klippyReady.GetBoolean();
}
_logger.LogDebug(
"Retrieved printer info from Moonraker at {Host}:{Port} — state: {State}",
hostnameOrIp, port, printerInfo.State);
return printerInfo;
}
catch (HttpRequestException ex)
{
_logger.LogWarning(ex,
"Failed to retrieve printer info from Moonraker at {Host}:{Port}",
hostnameOrIp, port);
return null;
}
catch (JsonException ex)
{
_logger.LogWarning(ex,
"Failed to parse Moonraker printer info response from {Host}:{Port}",
hostnameOrIp, port);
return null;
}
}
/// <inheritdoc />
public async Task<MoonrakerHistoryResponse> GetPrintHistoryAsync(
string hostnameOrIp,
int port,
string? apiKey,
int limit = 50,
CancellationToken cancellationToken = default)
{
var baseUrl = BuildBaseUrl(hostnameOrIp, port);
var historyResponse = new MoonrakerHistoryResponse();
try
{
using var request = CreateRequest(
HttpMethod.Get,
$"{baseUrl}/server/history/items?limit={limit}",
apiKey);
using var response = await _httpClient.SendAsync(request, cancellationToken);
response.EnsureSuccessStatusCode();
var json = await response.Content.ReadFromJsonAsync<JsonElement>(cancellationToken: cancellationToken);
if (json.TryGetProperty("result", out var result))
{
if (result.TryGetProperty("count", out var count))
historyResponse.TotalCount = count.GetInt32();
if (result.TryGetProperty("items", out var items))
{
foreach (var item in items.EnumerateArray())
{
var job = MapPrintJob(item);
historyResponse.Items.Add(job);
}
}
}
_logger.LogDebug(
"Retrieved {JobCount} print history items from Moonraker at {Host}:{Port}",
historyResponse.Items.Count, hostnameOrIp, port);
}
catch (HttpRequestException ex)
{
_logger.LogWarning(ex,
"Failed to retrieve print history from Moonraker at {Host}:{Port}",
hostnameOrIp, port);
}
catch (JsonException ex)
{
_logger.LogWarning(ex,
"Failed to parse Moonraker history response from {Host}:{Port}",
hostnameOrIp, port);
}
return historyResponse;
}
/// <inheritdoc />
public async Task<MoonrakerPrintStats?> GetPrintStatsAsync(
string hostnameOrIp,
int port,
string? apiKey,
CancellationToken cancellationToken = default)
{
var baseUrl = BuildBaseUrl(hostnameOrIp, port);
try
{
using var request = CreateRequest(
HttpMethod.Get,
$"{baseUrl}/printer/objects/query?print_stats",
apiKey);
using var response = await _httpClient.SendAsync(request, cancellationToken);
response.EnsureSuccessStatusCode();
var json = await response.Content.ReadFromJsonAsync<JsonElement>(cancellationToken: cancellationToken);
if (json.TryGetProperty("result", out var result)
&& result.TryGetProperty("status", out var status)
&& status.TryGetProperty("print_stats", out var printStats))
{
var stats = MapPrintStats(printStats);
_logger.LogDebug(
"Retrieved print stats from Moonraker at {Host}:{Port} — state: {State}, filament: {FilamentMm}mm",
hostnameOrIp, port, stats.State, stats.FilamentUsedMm);
return stats;
}
_logger.LogWarning(
"Moonraker print_stats not found in response from {Host}:{Port}",
hostnameOrIp, port);
return null;
}
catch (HttpRequestException ex)
{
_logger.LogWarning(ex,
"Failed to retrieve print stats from Moonraker at {Host}:{Port}",
hostnameOrIp, port);
return null;
}
catch (JsonException ex)
{
_logger.LogWarning(ex,
"Failed to parse Moonraker print stats response from {Host}:{Port}",
hostnameOrIp, port);
return null;
}
}
/// <inheritdoc />
public async Task<MoonrakerDisplayStatus?> GetDisplayStatusAsync(
string hostnameOrIp,
int port,
string? apiKey,
CancellationToken cancellationToken = default)
{
var baseUrl = BuildBaseUrl(hostnameOrIp, port);
try
{
using var request = CreateRequest(
HttpMethod.Get,
$"{baseUrl}/printer/objects/query?display_status",
apiKey);
using var response = await _httpClient.SendAsync(request, cancellationToken);
response.EnsureSuccessStatusCode();
var json = await response.Content.ReadFromJsonAsync<JsonElement>(cancellationToken: cancellationToken);
if (json.TryGetProperty("result", out var result)
&& result.TryGetProperty("status", out var status)
&& status.TryGetProperty("display_status", out var displayStatus))
{
var ds = new MoonrakerDisplayStatus();
if (displayStatus.TryGetProperty("progress", out var progress))
ds.Progress = progress.GetDecimal();
if (displayStatus.TryGetProperty("message", out var message))
ds.Message = message.GetString() ?? string.Empty;
_logger.LogDebug(
"Retrieved display status from Moonraker at {Host}:{Port} — progress: {Progress:P0}",
hostnameOrIp, port, ds.Progress);
return ds;
}
_logger.LogWarning(
"Moonraker display_status not found in response from {Host}:{Port}",
hostnameOrIp, port);
return null;
}
catch (HttpRequestException ex)
{
_logger.LogWarning(ex,
"Failed to retrieve display status from Moonraker at {Host}:{Port}",
hostnameOrIp, port);
return null;
}
catch (JsonException ex)
{
_logger.LogWarning(ex,
"Failed to parse Moonraker display status response from {Host}:{Port}",
hostnameOrIp, port);
return null;
}
}
/// <inheritdoc />
public async Task<Dictionary<string, decimal>> GetFilamentUsageAsync(
string hostnameOrIp,
int port,
string? apiKey,
CancellationToken cancellationToken = default)
{
// Delegate to the typed GetPrintHistoryAsync and extract metrics
var history = await GetPrintHistoryAsync(hostnameOrIp, port, apiKey, limit: 1, cancellationToken);
var result = new Dictionary<string, decimal>();
if (history.Items.Count > 0)
{
var latestJob = history.Items[0];
result["mm_extruded"] = latestJob.FilamentUsedMm;
result["print_duration_seconds"] = latestJob.PrintDurationSeconds;
}
_logger.LogDebug(
"Retrieved filament usage from Moonraker at {Host}:{Port}: {MetricCount} metrics",
hostnameOrIp, port, result.Count);
return result;
}
/// <summary>
/// Builds the base URL for Moonraker API calls from hostname and port.
/// </summary>
private static string BuildBaseUrl(string hostnameOrIp, int port)
{
return $"http://{hostnameOrIp}:{port}";
}
/// <summary>
/// Creates an HttpRequestMessage with the optional API key header.
/// </summary>
private static HttpRequestMessage CreateRequest(HttpMethod method, string url, string? apiKey)
{
var request = new HttpRequestMessage(method, url);
if (!string.IsNullOrEmpty(apiKey))
{
request.Headers.Add("X-Api-Key", apiKey);
}
return request;
}
/// <summary>
/// Maps a JSON element representing a Moonraker print job history item
/// to a <see cref="MoonrakerPrintJob"/> DTO.
/// </summary>
private static MoonrakerPrintJob MapPrintJob(JsonElement item)
{
var job = new MoonrakerPrintJob();
if (item.TryGetProperty("job_id", out var jobId))
job.JobId = jobId.GetString() ?? string.Empty;
if (item.TryGetProperty("filename", out var filename))
job.Filename = filename.GetString() ?? string.Empty;
if (item.TryGetProperty("status", out var status))
job.Status = status.GetString() ?? string.Empty;
if (item.TryGetProperty("filament_used", out var filamentUsed))
job.FilamentUsedMm = filamentUsed.GetDecimal();
if (item.TryGetProperty("print_duration", out var printDuration))
job.PrintDurationSeconds = printDuration.GetDecimal();
if (item.TryGetProperty("total_duration", out var totalDuration))
job.TotalDurationSeconds = totalDuration.GetDecimal();
if (item.TryGetProperty("start_time", out var startTime) && startTime.ValueKind != JsonValueKind.Null)
{
if (startTime.TryGetInt64(out var startTimeSeconds))
job.StartTime = DateTimeOffset.FromUnixTimeSeconds(startTimeSeconds).UtcDateTime;
}
if (item.TryGetProperty("end_time", out var endTime) && endTime.ValueKind != JsonValueKind.Null)
{
if (endTime.TryGetInt64(out var endTimeSeconds))
job.EndTime = DateTimeOffset.FromUnixTimeSeconds(endTimeSeconds).UtcDateTime;
}
if (item.TryGetProperty("metadata", out var metadata) && metadata.ValueKind == JsonValueKind.Object)
{
foreach (var prop in metadata.EnumerateObject())
{
object value = prop.Value.ValueKind switch
{
JsonValueKind.String => prop.Value.GetString() ?? string.Empty,
JsonValueKind.Number => prop.Value.GetDecimal(),
JsonValueKind.True => true,
JsonValueKind.False => false,
_ => prop.Value.ToString() ?? string.Empty
};
job.Metadata[prop.Name] = value;
}
}
return job;
}
/// <summary>
/// Maps a JSON element representing Moonraker print_stats
/// to a <see cref="MoonrakerPrintStats"/> DTO.
/// </summary>
private static MoonrakerPrintStats MapPrintStats(JsonElement printStats)
{
var stats = new MoonrakerPrintStats();
if (printStats.TryGetProperty("state", out var state))
stats.State = state.GetString() ?? string.Empty;
if (printStats.TryGetProperty("filament_used", out var filamentUsed))
stats.FilamentUsedMm = filamentUsed.GetDecimal();
if (printStats.TryGetProperty("print_duration", out var printDuration))
stats.PrintDurationSeconds = printDuration.GetDecimal();
if (printStats.TryGetProperty("filename", out var filename) && filename.ValueKind != JsonValueKind.Null)
stats.Filename = filename.GetString();
if (printStats.TryGetProperty("message", out var message) && message.ValueKind != JsonValueKind.Null)
stats.Message = message.GetString();
return stats;
}
}
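`GetFilamentUsageAsync` reports `mm_extruded`, while the rest of the system works in derived grams (see the `GramsDerived` warnings in `CostPerPrintService`, which mention mm extruded, filament diameter, and material density). The standard geometric conversion is volume × density; this diff does not show the actual derivation code, so the following is a hypothetical sketch of that conversion, with the `FilamentMath` name and the rounding choice being assumptions:

```csharp
using System;

// Hypothetical sketch: converting extruded length to grams via
// cross-section area (mm²) × length (mm) = volume (mm³) → cm³ × density (g/cm³).
public static class FilamentMath
{
    public static decimal GramsFromMm(decimal mmExtruded, decimal diameterMm, decimal densityGPerCm3)
    {
        var radius = (double)diameterMm / 2.0;
        var volumeMm3 = Math.PI * radius * radius * (double)mmExtruded;
        // 1000 mm³ = 1 cm³; rounding to 2 places is an illustrative choice
        return Math.Round((decimal)(volumeMm3 / 1000.0) * densityGPerCm3, 2);
    }
}
```

For example, 1000 mm of 1.75 mm PLA at 1.24 g/cm³ works out to roughly 2.98 g.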

View File

@@ -0,0 +1,320 @@
using Extrudex.Domain.DTOs.Moonraker;
using Extrudex.Domain.Entities;
using Extrudex.Domain.Enums;
using Extrudex.Domain.Interfaces;
using Extrudex.Infrastructure.Configuration;
using Extrudex.Infrastructure.Data;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;
namespace Extrudex.Infrastructure.Services;
/// <summary>
/// Service that syncs Moonraker printer status and print job history into the
/// Extrudex database. Queries all active Moonraker printers, fetches their
/// current operational state, and maps completed print jobs to PrintJob and
/// FilamentUsage entities with derived gram calculations.
/// </summary>
public class MoonrakerPrinterSyncService : IMoonrakerPrinterSyncService
{
private readonly ExtrudexDbContext _dbContext;
private readonly IMoonrakerClient _moonrakerClient;
private readonly ILogger<MoonrakerPrinterSyncService> _logger;
/// <summary>
/// Creates a new MoonrakerPrinterSyncService.
/// </summary>
/// <param name="dbContext">The EF Core database context for persisting updates.</param>
/// <param name="moonrakerClient">The Moonraker HTTP client for fetching printer data.</param>
/// <param name="logger">Logger for diagnostic output.</param>
public MoonrakerPrinterSyncService(
ExtrudexDbContext dbContext,
IMoonrakerClient moonrakerClient,
ILogger<MoonrakerPrinterSyncService> logger)
{
_dbContext = dbContext;
_moonrakerClient = moonrakerClient;
_logger = logger;
}
/// <inheritdoc />
public async Task<int> SyncAllAsync(CancellationToken cancellationToken = default)
{
_logger.LogInformation("Starting Moonraker printer sync cycle");
var printers = await _dbContext.Printers
.Where(p => p.IsActive && p.ConnectionType == ConnectionType.Moonraker)
.Include(p => p.AmsUnits)
.ThenInclude(u => u.Slots)
.ThenInclude(s => s.Spool)
.ThenInclude(s => s.MaterialBase)
.Include(p => p.PrintJobs)
.ToListAsync(cancellationToken);
if (printers.Count == 0)
{
_logger.LogInformation("No active Moonraker printers found — skipping sync");
return 0;
}
_logger.LogInformation("Found {PrinterCount} active Moonraker printer(s) to sync", printers.Count);
var syncedCount = 0;
foreach (var printer in printers)
{
try
{
await SyncPrinterAsync(printer, cancellationToken);
syncedCount++;
}
catch (Exception ex) when (ex is not OperationCanceledException)
{
_logger.LogError(ex,
"Error syncing printer {PrinterName} ({Host}:{Port})",
printer.Name, printer.HostnameOrIp, printer.Port);
// Mark printer as offline if we can't reach it
printer.Status = PrinterStatus.Offline;
}
}
await _dbContext.SaveChangesAsync(cancellationToken);
_logger.LogInformation(
"Moonraker printer sync cycle complete — {SyncedCount}/{TotalCount} printers synced",
syncedCount, printers.Count);
return syncedCount;
}
/// <summary>
/// Syncs a single Moonraker printer: updates its status, fetches print history,
/// and maps new print jobs to database entities.
/// </summary>
private async Task SyncPrinterAsync(Printer printer, CancellationToken cancellationToken)
{
// Step 1: Fetch printer status
var printerInfo = await _moonrakerClient.GetPrinterInfoAsync(
printer.HostnameOrIp, printer.Port, printer.ApiKey, cancellationToken);
var printStats = await _moonrakerClient.GetPrintStatsAsync(
printer.HostnameOrIp, printer.Port, printer.ApiKey, cancellationToken);
// Step 2: Update printer status
UpdatePrinterStatus(printer, printerInfo, printStats);
printer.LastSeenAt = DateTime.UtcNow;
_logger.LogDebug(
"Printer {PrinterName} status updated to {Status}",
printer.Name, printer.Status);
// Step 3: Fetch and map print job history
var history = await _moonrakerClient.GetPrintHistoryAsync(
printer.HostnameOrIp, printer.Port, printer.ApiKey,
limit: 25,
cancellationToken);
if (history.Items.Count == 0)
{
_logger.LogDebug("No print history returned for printer {PrinterName}", printer.Name);
return;
}
var newJobsCount = await MapPrintJobsAsync(printer, history.Items, cancellationToken);
if (newJobsCount > 0)
{
_logger.LogInformation(
"Mapped {NewJobsCount} new print job(s) from printer {PrinterName}",
newJobsCount, printer.Name);
}
}
/// <summary>
/// Updates the printer's operational status based on Moonraker telemetry.
/// Maps Klipper/Moonraker state strings to the PrinterStatus enum.
/// </summary>
private void UpdatePrinterStatus(
Printer printer,
MoonrakerPrinterInfo? printerInfo,
MoonrakerPrintStats? printStats)
{
// Prefer print_stats state — it's the most authoritative
if (printStats != null)
{
printer.Status = printStats.State.ToLowerInvariant() switch
{
"printing" => PrinterStatus.Printing,
"paused" => PrinterStatus.Paused,
"complete" => PrinterStatus.Idle,
"standby" => PrinterStatus.Idle,
"cancelled" => PrinterStatus.Idle,
"error" => PrinterStatus.Error,
_ => PrinterStatus.Idle
};
return;
}
// Fall back to printer_info state
if (printerInfo != null)
{
printer.Status = printerInfo.State.ToLowerInvariant() switch
{
"ready" => PrinterStatus.Idle,
"startup" => PrinterStatus.Idle,
"shutdown" => PrinterStatus.Offline,
"error" => PrinterStatus.Error,
"cancelled" => PrinterStatus.Idle,
_ => printer.Status // Preserve existing status if unknown
};
}
}
/// <summary>
/// Maps Moonraker print job history items to Extrudex PrintJob and FilamentUsage entities.
/// Only creates records for jobs not already tracked (by Moonraker JobId stored in GcodeFilePath).
/// </summary>
private async Task<int> MapPrintJobsAsync(
Printer printer,
List<MoonrakerPrintJob> historyItems,
CancellationToken cancellationToken)
{
// Build a set of already-tracked Moonraker JobIds for this printer
// We store the Moonraker JobId in the GcodeFilePath field with a "moonraker:" prefix
var trackedJobIds = await _dbContext.PrintJobs
.Where(pj => pj.PrinterId == printer.Id && pj.GcodeFilePath != null && pj.GcodeFilePath.StartsWith("moonraker:"))
.Select(pj => pj.GcodeFilePath!)
.ToListAsync(cancellationToken);
var trackedIdSet = new HashSet<string>(trackedJobIds);
var newJobsCount = 0;
// Find the default spool for this printer (first active spool in AMS, or first active spool overall)
var defaultSpool = FindDefaultSpool(printer);
foreach (var moonrakerJob in historyItems)
{
var jobIdKey = $"moonraker:{moonrakerJob.JobId}";
if (trackedIdSet.Contains(jobIdKey))
{
continue; // Already tracked — skip
}
// Only map completed, cancelled, or errored jobs (not in_progress)
// In-progress jobs will be captured on the next cycle once they finish
if (string.Equals(moonrakerJob.Status, "in_progress", StringComparison.OrdinalIgnoreCase))
{
continue;
}
// Map Moonraker job status to JobStatus enum
var jobStatus = moonrakerJob.Status.ToLowerInvariant() switch
{
"completed" => JobStatus.Completed,
"cancelled" => JobStatus.Cancelled,
"error" => JobStatus.Failed,
_ => JobStatus.Completed
};
// Calculate derived grams if we have a spool and filament data
decimal gramsDerived = 0m;
decimal filamentDiameterMm = 1.75m;
decimal materialDensity = 1.24m; // PLA default
if (defaultSpool != null)
{
filamentDiameterMm = defaultSpool.FilamentDiameterMm;
materialDensity = defaultSpool.MaterialBase.DensityGperCm3;
gramsDerived = CalculateGrams(moonrakerJob.FilamentUsedMm, filamentDiameterMm, materialDensity);
}
else if (moonrakerJob.FilamentUsedMm > 0)
{
gramsDerived = CalculateGrams(moonrakerJob.FilamentUsedMm, 1.75m, 1.24m);
_logger.LogWarning(
"No default spool found for printer {PrinterName} — using PLA defaults for grams derivation on job {JobId}",
printer.Name, moonrakerJob.JobId);
}
var printJob = new PrintJob
{
PrinterId = printer.Id,
SpoolId = defaultSpool?.Id ?? Guid.Empty,
PrintName = moonrakerJob.Filename,
GcodeFilePath = jobIdKey,
MmExtruded = moonrakerJob.FilamentUsedMm,
GramsDerived = gramsDerived,
StartedAt = moonrakerJob.StartTime,
CompletedAt = moonrakerJob.EndTime,
Status = jobStatus,
DataSource = DataSource.Moonraker,
FilamentDiameterAtPrintMm = filamentDiameterMm,
MaterialDensityAtPrint = materialDensity,
Notes = $"Auto-imported from Moonraker (JobId: {moonrakerJob.JobId})"
};
_dbContext.PrintJobs.Add(printJob);
// Create a FilamentUsage record if filament was consumed
if (moonrakerJob.FilamentUsedMm > 0 && defaultSpool != null)
{
var usage = new FilamentUsage
{
PrintJob = printJob,
SpoolId = defaultSpool.Id,
PrinterId = printer.Id,
GramsUsed = gramsDerived,
MmExtruded = moonrakerJob.FilamentUsedMm,
RecordedAt = DateTime.UtcNow,
Notes = $"Auto-imported from Moonraker history (JobId: {moonrakerJob.JobId})"
};
_dbContext.FilamentUsages.Add(usage);
}
newJobsCount++;
trackedIdSet.Add(jobIdKey); // Prevent duplicates within this batch
}
return newJobsCount;
}
/// <summary>
/// Finds the default spool for a printer. Returns the first spool loaded
/// in an AMS slot, or null if no spool is available.
/// </summary>
private static Spool? FindDefaultSpool(Printer printer)
{
// Prefer the first active spool in an AMS slot
foreach (var amsUnit in printer.AmsUnits)
{
foreach (var slot in amsUnit.Slots)
{
if (slot.Spool != null && slot.Spool.IsActive && !slot.Spool.IsArchived)
{
return slot.Spool;
}
}
}
return null;
}
/// <summary>
/// Calculates derived grams from millimeters extruded:
/// grams = length_cm × cross_section_area_cm² × density_g_per_cm³,
/// where cross_section_area_cm² = π × (diameter_cm / 2)² and both the
/// diameter and the extruded length are converted from mm to cm first.
/// </summary>
private static decimal CalculateGrams(decimal mmExtruded, decimal diameterMm, decimal densityGperCm3)
{
if (mmExtruded <= 0) return 0m;
var radiusCm = (double)diameterMm / 2.0 / 10.0; // mm to cm
var crossSectionAreaCm2 = Math.PI * radiusCm * radiusCm;
var mmToCm = (double)mmExtruded / 10.0;
var grams = mmToCm * crossSectionAreaCm2 * (double)densityGperCm3;
return (decimal)grams;
}
}


@@ -0,0 +1,390 @@
using Extrudex.Domain.DTOs.Moonraker;
using Extrudex.Domain.Entities;
using Extrudex.Domain.Enums;
using Extrudex.Domain.Interfaces;
using Extrudex.Infrastructure.Data;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
namespace Extrudex.Infrastructure.Services;
/// <summary>
/// Configuration options for the Moonraker usage polling service.
/// </summary>
public class MoonrakerPollerOptions
{
/// <summary>
/// How often to poll each Moonraker printer for filament usage data.
/// Default: 30 seconds.
/// </summary>
public TimeSpan PollInterval { get; set; } = TimeSpan.FromSeconds(30);
/// <summary>
/// Timeout for individual Moonraker HTTP requests.
/// Default: 10 seconds.
/// </summary>
public TimeSpan RequestTimeout { get; set; } = TimeSpan.FromSeconds(10);
/// <summary>
/// Whether the polling service is enabled. Default: true.
/// Set to false to disable polling (e.g., in development or testing).
/// </summary>
public bool Enabled { get; set; } = true;
}
/// <summary>
/// Background service that periodically polls Moonraker-connected printers
/// for filament usage data. When a print job is detected as complete,
/// the usage data is persisted to the FilamentUsage table via
/// <see cref="IFilamentUsageService"/>.
///
/// <para>Polling logic:</para>
/// <list type="number">
/// <item>Query the database for all active printers with ConnectionType == Moonraker.</item>
/// <item>For each printer, call <see cref="IMoonrakerClient.GetPrintStatsAsync"/> for live data
/// and <see cref="IMoonrakerClient.GetPrintHistoryAsync"/> for completed job history.</item>
/// <item>If usage data is available and the print state is "complete",
/// create or update a FilamentUsage record.</item>
/// <item>If the printer is unreachable or returns malformed data, log a warning
/// and continue to the next printer (no crash).</item>
/// </list>
///
/// <para>Error handling:</para>
/// <list type="bullet">
/// <item>API unreachable: logged as warning, poller continues for other printers.</item>
/// <item>Malformed response: logged as warning, poller continues.</item>
/// <item>Database errors: logged as error, poller continues.</item>
/// </list>
/// </summary>
public class MoonrakerUsagePoller : BackgroundService
{
private readonly IServiceScopeFactory _scopeFactory;
private readonly ILogger<MoonrakerUsagePoller> _logger;
private readonly MoonrakerPollerOptions _options;
/// <summary>
/// Tracks which Moonraker print jobs have already been recorded,
/// keyed by "printerId:gcodeFileName" to avoid duplicate recording.
/// </summary>
private readonly HashSet<string> _recordedJobs = new();
public MoonrakerUsagePoller(
IServiceScopeFactory scopeFactory,
ILogger<MoonrakerUsagePoller> logger,
IOptions<MoonrakerPollerOptions> options)
{
_scopeFactory = scopeFactory;
_logger = logger;
_options = options.Value;
}
protected override async Task ExecuteAsync(CancellationToken stoppingToken)
{
if (!_options.Enabled)
{
_logger.LogInformation("Moonraker usage poller is disabled via configuration.");
return;
}
_logger.LogInformation(
"Moonraker usage poller starting. Poll interval: {Interval}",
_options.PollInterval);
while (!stoppingToken.IsCancellationRequested)
{
try
{
await PollAllPrintersAsync(stoppingToken);
}
catch (Exception ex)
{
_logger.LogError(ex,
"Unexpected error in Moonraker usage poller cycle. Continuing.");
}
await Task.Delay(_options.PollInterval, stoppingToken);
}
_logger.LogInformation("Moonraker usage poller stopping.");
}
private async Task PollAllPrintersAsync(CancellationToken cancellationToken)
{
using var scope = _scopeFactory.CreateScope();
var dbContext = scope.ServiceProvider.GetRequiredService<ExtrudexDbContext>();
var moonrakerClient = scope.ServiceProvider.GetRequiredService<IMoonrakerClient>();
var usageService = scope.ServiceProvider.GetRequiredService<IFilamentUsageService>();
var printers = await dbContext.Printers
.Where(p => p.IsActive && p.ConnectionType == ConnectionType.Moonraker)
.ToListAsync(cancellationToken);
if (printers.Count == 0)
{
_logger.LogDebug("No active Moonraker printers found.");
return;
}
_logger.LogDebug("Polling {Count} Moonraker printer(s).", printers.Count);
foreach (var printer in printers)
{
await PollPrinterAsync(
printer, moonrakerClient, usageService, dbContext, cancellationToken);
}
}
private async Task PollPrinterAsync(
Printer printer,
IMoonrakerClient moonrakerClient,
IFilamentUsageService usageService,
ExtrudexDbContext dbContext,
CancellationToken cancellationToken)
{
_logger.LogDebug(
"Polling Moonraker printer {PrinterName} ({Host}:{Port})",
printer.Name, printer.HostnameOrIp, printer.Port);
try
{
var printStats = await moonrakerClient.GetPrintStatsAsync(
printer.HostnameOrIp,
printer.Port,
printer.ApiKey,
cancellationToken);
if (printStats is null)
{
_logger.LogDebug(
"No print stats available from printer {PrinterName}.", printer.Name);
return;
}
printer.LastSeenAt = DateTime.UtcNow;
await dbContext.SaveChangesAsync(cancellationToken);
_logger.LogDebug(
"Printer {PrinterName}: state={State}, filament={Mm}mm, file={File}",
printer.Name, printStats.State, printStats.FilamentUsedMm, printStats.Filename);
decimal mmExtruded = printStats.FilamentUsedMm;
if (mmExtruded <= 0)
{
_logger.LogDebug(
"Printer {PrinterName} has no filament usage to record.", printer.Name);
return;
}
if (!IsCompleteState(printStats.State))
{
_logger.LogDebug(
"Printer {PrinterName} print state '{State}' is not complete; skipping.",
printer.Name, printStats.State);
return;
}
// A per-poll GUID fallback would defeat deduplication (a fresh key every
// cycle), so skip completed prints that report no filename.
if (string.IsNullOrEmpty(printStats.Filename))
{
    _logger.LogDebug(
        "Printer {PrinterName} completed a print with no filename; skipping usage recording.",
        printer.Name);
    return;
}
string gcodeFileName = printStats.Filename;
var deduplicationKey = $"{printer.Id}:{gcodeFileName}";
if (_recordedJobs.Contains(deduplicationKey))
{
_logger.LogDebug(
"Printer {PrinterName} job '{File}' already recorded; skipping.",
printer.Name, gcodeFileName);
return;
}
DateTime? startedAt = null;
DateTime? completedAt = null;
try
{
var history = await moonrakerClient.GetPrintHistoryAsync(
printer.HostnameOrIp, printer.Port, printer.ApiKey,
limit: 1, cancellationToken);
if (history.Items.Count > 0)
{
var latestJob = history.Items[0];
startedAt = latestJob.StartTime;
completedAt = latestJob.EndTime;
}
}
catch (Exception ex)
{
_logger.LogDebug(ex,
"Could not fetch history for printer {PrinterName}; proceeding with stats only.",
printer.Name);
}
var printJob = await FindOrCreatePrintJobAsync(
dbContext, printer, mmExtruded, gcodeFileName,
startedAt, completedAt, cancellationToken);
if (printJob is null)
{
_logger.LogWarning(
"Could not find or create print job for printer {PrinterName}. No active spool found.",
printer.Name);
return;
}
var spool = await dbContext.Spools.FindAsync(
new object[] { printJob.SpoolId }, cancellationToken);
var gramsUsed = CalculateGramsUsed(mmExtruded, spool);
await usageService.RecordUsageAsync(
printJobId: printJob.Id,
spoolId: printJob.SpoolId,
printerId: printer.Id,
gramsUsed: gramsUsed,
mmExtruded: mmExtruded,
notes: $"Moonraker auto-recorded: {gcodeFileName}",
cancellationToken: cancellationToken);
_recordedJobs.Add(deduplicationKey);
_logger.LogInformation(
"Recorded Moonraker usage for printer {PrinterName}: {Mm}mm / {Grams}g, job '{File}'",
printer.Name, mmExtruded, gramsUsed, gcodeFileName);
}
catch (HttpRequestException ex)
{
_logger.LogWarning(ex,
"Moonraker API unreachable for printer {PrinterName} ({Host}:{Port}). Will retry next cycle.",
printer.Name, printer.HostnameOrIp, printer.Port);
}
catch (TaskCanceledException) when (cancellationToken.IsCancellationRequested)
{
throw;
}
catch (TaskCanceledException ex)
{
_logger.LogWarning(ex,
"Moonraker request timed out for printer {PrinterName} ({Host}:{Port}).",
printer.Name, printer.HostnameOrIp, printer.Port);
}
catch (Exception ex)
{
_logger.LogError(ex,
"Unexpected error polling Moonraker printer {PrinterName}. Continuing to next printer.",
printer.Name);
}
}
private static bool IsCompleteState(string state) =>
state.Equals("complete", StringComparison.OrdinalIgnoreCase) ||
state.Equals("completed", StringComparison.OrdinalIgnoreCase);
private async Task<PrintJob?> FindOrCreatePrintJobAsync(
ExtrudexDbContext dbContext,
Printer printer,
decimal mmExtruded,
string gcodeFileName,
DateTime? startedAt,
DateTime? completedAt,
CancellationToken cancellationToken)
{
if (!string.IsNullOrEmpty(gcodeFileName))
{
var existingJob = await dbContext.PrintJobs
.Where(j => j.PrinterId == printer.Id &&
j.GcodeFilePath == gcodeFileName &&
j.DataSource == DataSource.Moonraker &&
j.Status != JobStatus.Cancelled)
.OrderByDescending(j => j.CreatedAt)
.FirstOrDefaultAsync(cancellationToken);
if (existingJob is not null)
{
existingJob.MmExtruded = mmExtruded;
existingJob.GramsDerived = CalculateGramsUsed(
mmExtruded,
await dbContext.Spools.FindAsync(
new object[] { existingJob.SpoolId }, cancellationToken));
existingJob.Status = JobStatus.Completed;
existingJob.CompletedAt = completedAt ?? DateTime.UtcNow;
existingJob.StartedAt ??= startedAt;
await dbContext.SaveChangesAsync(cancellationToken);
return existingJob;
}
}
var spool = await FindActiveSpoolForPrinterAsync(dbContext, printer, cancellationToken);
if (spool is null) return null;
var gramsDerived = CalculateGramsUsed(mmExtruded, spool);
var newJob = new PrintJob
{
PrinterId = printer.Id,
SpoolId = spool.Id,
PrintName = gcodeFileName ?? "Moonraker Print",
GcodeFilePath = gcodeFileName,
MmExtruded = mmExtruded,
GramsDerived = gramsDerived,
FilamentDiameterAtPrintMm = spool.FilamentDiameterMm,
MaterialDensityAtPrint = GetMaterialDensity(spool),
DataSource = DataSource.Moonraker,
Status = JobStatus.Completed,
StartedAt = startedAt ?? DateTime.UtcNow,
CompletedAt = completedAt ?? DateTime.UtcNow,
Notes = "Auto-created by Moonraker usage poller"
};
dbContext.PrintJobs.Add(newJob);
await dbContext.SaveChangesAsync(cancellationToken);
return newJob;
}
private static async Task<Spool?> FindActiveSpoolForPrinterAsync(
ExtrudexDbContext dbContext,
Printer printer,
CancellationToken cancellationToken)
{
var amsSpool = await dbContext.AmsSlots
.Include(s => s.Spool)
.ThenInclude(s => s!.MaterialBase)
.Include(s => s.AmsUnit)
.Where(s => s.AmsUnit.PrinterId == printer.Id && s.Spool != null && s.Spool.IsActive)
.Select(s => s.Spool)
.FirstOrDefaultAsync(cancellationToken);
if (amsSpool is not null) return amsSpool;
return await dbContext.Spools
.Include(s => s.MaterialBase)
.Where(s => s.IsActive)
.OrderByDescending(s => s.WeightRemainingGrams)
.FirstOrDefaultAsync(cancellationToken);
}
private static decimal CalculateGramsUsed(decimal mmExtruded, Spool? spool)
{
if (spool is null) return 0m;
var diameterMm = spool.FilamentDiameterMm;
var densityGcm3 = GetMaterialDensity(spool);
var radiusMm = diameterMm / 2m;
var crossSectionArea = Math.PI * (double)radiusMm * (double)radiusMm;
var volumeMm3 = (double)mmExtruded * crossSectionArea;
var volumeCm3 = volumeMm3 / 1000.0;
var grams = volumeCm3 * (double)densityGcm3;
return Math.Round((decimal)grams, 2);
}
private static decimal GetMaterialDensity(Spool? spool)
{
return spool?.MaterialBase?.Name?.ToUpperInvariant() switch
{
"PLA" => 1.24m,
"PETG" => 1.27m,
"ABS" => 1.04m,
"ASA" => 1.07m,
"TPU" => 1.21m,
"NYLON" or "PA" => 1.13m,
"PC" => 1.20m,
_ => 1.24m
};
}
}
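`GetMaterialDensity` is effectively a lookup table with a PLA fallback. A standalone Go sketch of the same mapping (values copied from the switch above, not an authoritative materials database):

```go
package main

import (
	"fmt"
	"strings"
)

// materialDensity holds g/cm³ values matching GetMaterialDensity's switch arms.
var materialDensity = map[string]float64{
	"PLA": 1.24, "PETG": 1.27, "ABS": 1.04, "ASA": 1.07,
	"TPU": 1.21, "NYLON": 1.13, "PA": 1.13, "PC": 1.20,
}

// densityFor is case-insensitive and falls back to PLA's density for
// unknown or missing material names, as the C# service does.
func densityFor(name string) float64 {
	if d, ok := materialDensity[strings.ToUpper(name)]; ok {
		return d
	}
	return 1.24
}

func main() {
	fmt.Println(densityFor("petg"), densityFor("carbonite")) // prints "1.27 1.24"
}
```

The fallback matters: a spool with a missing or unrecognized `MaterialBase` silently gets PLA's density, which slightly misstates grams for denser materials.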


@@ -0,0 +1,81 @@
using Extrudex.Domain.Entities;
using Extrudex.Domain.Enums;
using Extrudex.Domain.Interfaces;
using Extrudex.Infrastructure.Data;
using Microsoft.EntityFrameworkCore;
namespace Extrudex.Infrastructure.Services;
/// <summary>
/// Implementation of <see cref="IUsageLogService"/> that persists usage entries
/// to the usage_logs table via EF Core.
/// </summary>
public class UsageLogService : IUsageLogService
{
private readonly ExtrudexDbContext _dbContext;
/// <summary>
/// Initializes a new instance of the <see cref="UsageLogService"/> class.
/// </summary>
/// <param name="dbContext">The EF Core database context for data persistence.</param>
public UsageLogService(ExtrudexDbContext dbContext)
{
_dbContext = dbContext;
}
/// <inheritdoc/>
public async Task<UsageLog> RecordUsageAsync(
Guid spoolId,
decimal gramsUsed,
DataSource dataSource,
Guid? printerId = null,
Guid? printJobId = null,
decimal? mmExtruded = null,
DateTime? usageTimestamp = null,
string? notes = null)
{
var entry = new UsageLog
{
SpoolId = spoolId,
GramsUsed = gramsUsed,
DataSource = dataSource,
PrinterId = printerId,
PrintJobId = printJobId,
MmExtruded = mmExtruded,
UsageTimestamp = usageTimestamp ?? DateTime.UtcNow,
Notes = notes
};
_dbContext.UsageLogs.Add(entry);
await _dbContext.SaveChangesAsync();
return entry;
}
/// <inheritdoc/>
public async Task<IEnumerable<UsageLog>> GetBySpoolAsync(Guid spoolId, CancellationToken cancellationToken = default)
{
return await _dbContext.UsageLogs
.Where(u => u.SpoolId == spoolId)
.OrderByDescending(u => u.UsageTimestamp)
.ToListAsync(cancellationToken);
}
/// <inheritdoc/>
public async Task<IEnumerable<UsageLog>> GetByPrinterAsync(Guid printerId, CancellationToken cancellationToken = default)
{
return await _dbContext.UsageLogs
.Where(u => u.PrinterId == printerId)
.OrderByDescending(u => u.UsageTimestamp)
.ToListAsync(cancellationToken);
}
/// <inheritdoc/>
public async Task<IEnumerable<UsageLog>> GetByPrintJobAsync(Guid printJobId, CancellationToken cancellationToken = default)
{
return await _dbContext.UsageLogs
.Where(u => u.PrintJobId == printJobId)
.OrderByDescending(u => u.UsageTimestamp)
.ToListAsync(cancellationToken);
}
}


@@ -1,6 +1,9 @@
using System.Reflection;
using Extrudex.API.Filters;
using Extrudex.API.Hubs;
using Extrudex.API.Jobs;
using Extrudex.Domain.Interfaces;
using Extrudex.Infrastructure.Configuration;
using Extrudex.Infrastructure.Data;
using Extrudex.Infrastructure.Services;
using FluentValidation;
@@ -23,7 +26,10 @@ builder.Services.AddDbContext<ExtrudexDbContext>(options =>
options.UseNpgsql(connectionString));
// ── API Services ───────────────────────────────────────────
builder.Services.AddControllers();
builder.Services.AddControllers(options =>
{
options.Filters.AddService<FluentValidationFilter>();
});
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen(c =>
{
@@ -46,10 +52,31 @@ builder.Services.AddSwaggerGen(c =>
// ── QR Code Generation ──────────────────────────────────────
builder.Services.AddSingleton<IQrCodeService, QrCodeService>();
// ── Cost Per Print Calculation ─────────────────────────────
builder.Services.AddScoped<ICostPerPrintService, CostPerPrintService>();
// ── Low Stock Detection ────────────────────────────────────
builder.Services.AddSingleton<ILowStockDetector, LowStockDetector>();
// ── Usage Logging ───────────────────────────────────────────
builder.Services.AddScoped<IUsageLogService, UsageLogService>();
// ── Filament Usage Service ──────────────────────────────────
builder.Services.AddScoped<IFilamentUsageService, FilamentUsageService>();
// ── Moonraker Usage Poller (Background Service) ─────────────
builder.Services.Configure<MoonrakerPollerOptions>(
builder.Configuration.GetSection("MoonrakerPoller"));
builder.Services.AddHostedService<MoonrakerUsagePoller>();
// ── FluentValidation ──────────────────────────────────────
// Registers all validators from the API assembly into DI.
builder.Services.AddValidatorsFromAssembly(Assembly.GetExecutingAssembly());
// Register the FluentValidation action filter so validators run automatically
// on all API controller actions before the action executes.
builder.Services.AddScoped<FluentValidationFilter>();
// ── CORS (kiosk + remote browser) ─────────────────────────
// AllowAnyOrigin disallows credentials by spec; this is fine for
// REST API calls. SignalR WebSockets negotiate without credentials
@@ -69,6 +96,26 @@ builder.Services.AddCors(options =>
// ── SignalR (real-time printer updates) ────────────────────
builder.Services.AddSignalR();
// ── Filament Usage Sync (Background Job) ──────────────────
builder.Services.Configure<FilamentUsageSyncOptions>(
builder.Configuration.GetSection(FilamentUsageSyncOptions.SectionName));
builder.Services.AddHttpClient<IMoonrakerClient, MoonrakerClient>(client =>
{
client.DefaultRequestHeaders.Add("User-Agent", "Extrudex/1.0");
});
builder.Services.AddScoped<IFilamentUsageSyncService, FilamentUsageSyncService>();
builder.Services.AddHostedService<FilamentUsageSyncJob>();
// ── Moonraker Printer Sync (Background Service) ──────────
builder.Services.Configure<MoonrakerPrinterSyncOptions>(
builder.Configuration.GetSection(MoonrakerPrinterSyncOptions.SectionName));
builder.Services.AddScoped<IMoonrakerPrinterSyncService, MoonrakerPrinterSyncService>();
builder.Services.AddHostedService<MoonrakerPrinterSyncJob>();
// ── Health Checks ───────────────────────────────────────────
builder.Services.AddHealthChecks()
.AddNpgSql(connectionString);
var app = builder.Build();
// ── Middleware ──────────────────────────────────────────────
@@ -85,6 +132,9 @@ app.MapControllers();
// ── Hub Endpoints ───────────────────────────────────────────
app.MapHub<PrinterHub>("/hubs/printer");
// ── Health Check Endpoint ──────────────────────────────────
app.MapHealthChecks("/health");
app.Run();
// Helper: builds a connection string from individual env vars.


@@ -8,5 +8,10 @@
},
"ConnectionStrings": {
"ExtrudexDb": "Host=localhost;Port=5432;Database=extrudex_dev;Username=extrudex;Password=changeme"
},
"FilamentUsageSync": {
"PollingInterval": "00:01:00",
"RequestTimeout": "00:00:30",
"Enabled": true
}
}


@@ -9,5 +9,25 @@
"AllowedHosts": "*",
"ConnectionStrings": {
"ExtrudexDb": "Host=localhost;Port=5432;Database=extrudex;Username=extrudex;Password=changeme"
},
"FilamentUsageSync": {
"PollingInterval": "00:05:00",
"RequestTimeout": "00:00:30",
"Enabled": true
},
"MoonrakerPrinterSync": {
"PollingInterval": "00:01:00",
"RequestTimeout": "00:00:15",
"Enabled": true,
"HistoryBatchSize": 25
},
"FilamentAlerts": {
"LowStockThresholdPercent": 20
},
"MoonrakerPoller": {
"Enabled": true,
"PollInterval": "00:00:30",
"RequestTimeout": "00:00:10"
  }
}


@@ -0,0 +1,90 @@
package main
import (
"context"
"log/slog"
"net/http"
"os"
"os/signal"
"syscall"
"time"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/config"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/db"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/router"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/sse"
)
func main() {
// Setup structured logging
slog.SetDefault(slog.New(slog.NewTextHandler(os.Stdout, &slog.HandlerOptions{
Level: slog.LevelInfo,
})))
// Load configuration
cfg, err := config.Load()
if err != nil {
slog.Error("failed to load config", "error", err)
os.Exit(1)
}
slog.Info("config loaded", "port", cfg.Port, "cors_origin", cfg.CorsOrigin)
// Connect to database
dbPool, err := db.NewPool(cfg.DatabaseURL)
if err != nil {
slog.Error("failed to connect to database", "error", err)
os.Exit(1)
}
defer db.ClosePool(dbPool)
slog.Info("database connected")
// Create SSE broadcaster and start it
sseBC := sse.NewBroadcaster(128)
sseBC.Start()
defer sseBC.Stop()
slog.Info("sse broadcaster started")
// Create router
r := router.New(cfg, dbPool, sseBC)
// Create HTTP server
// WriteTimeout is 0 for SSE support — the Chi middleware.Timeout(60s)
// handles request-level timeouts on non-SSE routes.
server := &http.Server{
Addr: ":" + cfg.Port,
Handler: r,
ReadTimeout: 15 * time.Second,
WriteTimeout: 0, // disabled for SSE long-lived connections
IdleTimeout: 60 * time.Second,
}
// Start server in goroutine
go func() {
slog.Info("server starting", "addr", server.Addr)
if err := server.ListenAndServe(); err != nil && err != http.ErrServerClosed {
slog.Error("server error", "error", err)
os.Exit(1)
}
}()
// Wait for shutdown signal
quit := make(chan os.Signal, 1)
signal.Notify(quit, syscall.SIGINT, syscall.SIGTERM)
<-quit
slog.Info("server shutting down")
// Graceful shutdown
ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
defer cancel()
if err := server.Shutdown(ctx); err != nil {
slog.Error("server shutdown error", "error", err)
}
db.ClosePool(dbPool)
slog.Info("server stopped")
}
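The SSE specifics live in `internal/sse` and are not shown in this diff. As a rough sketch of why `WriteTimeout` is disabled above, here is an illustrative handler (field names follow the EventSource wire format; the handler itself is an assumption, not the real broadcaster):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// formatEvent renders one Server-Sent Events frame. Per the EventSource
// wire format, a frame is "event:" and "data:" lines ending in a blank line.
func formatEvent(name, data string) string {
	return fmt.Sprintf("event: %s\ndata: %s\n\n", name, data)
}

// sseHandler is illustrative only: it streams a heartbeat until the client
// disconnects. A non-zero server WriteTimeout would cut this stream off,
// which is why main sets WriteTimeout to 0.
func sseHandler(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Content-Type", "text/event-stream")
	w.Header().Set("Cache-Control", "no-cache")
	flusher, ok := w.(http.Flusher)
	if !ok {
		http.Error(w, "streaming unsupported", http.StatusInternalServerError)
		return
	}
	ticker := time.NewTicker(5 * time.Second)
	defer ticker.Stop()
	for {
		select {
		case <-r.Context().Done():
			return // client disconnected
		case t := <-ticker.C:
			fmt.Fprint(w, formatEvent("heartbeat", t.UTC().Format(time.RFC3339)))
			flusher.Flush()
		}
	}
}

func main() {
	fmt.Print(formatEvent("heartbeat", "ok"))
}
```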

backend/go.mod

@@ -0,0 +1,18 @@
module github.com/CubeCraft-Creations/Extrudex/backend
go 1.24
require (
github.com/go-chi/chi/v5 v5.2.0
github.com/jackc/pgx/v5 v5.7.4
github.com/kelseyhightower/envconfig v1.4.0
)
require (
github.com/jackc/pgpassfile v1.0.0 // indirect
github.com/jackc/pgservicefile v0.0.0-20240606120523-5a60cdf6a761 // indirect
github.com/jackc/puddle/v2 v2.2.2 // indirect
golang.org/x/crypto v0.31.0 // indirect
golang.org/x/sync v0.10.0 // indirect
golang.org/x/text v0.21.0 // indirect
)

backend/go.sum

@@ -0,0 +1,32 @@
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/go-chi/chi/v5 v5.2.0 h1:Aj1EtB0qR2Rdo2dG4O94RIU35w2lvQSj6BRA4+qwFL0=
github.com/go-chi/chi/v5 v5.2.0/go.mod h1:DslCQbL2OYiznFReuXYUmQ2hGd1aDpCnlMNITLSKoi8=
github.com/jackc/pgpassfile v1.0.0 h1:/6Hmqy13Ss2zCq62VdNG8tM1wchn8zjSGOBJ6icpsIM=
github.com/jackc/pgpassfile v1.0.0/go.mod h1:CEx0iS5ambNFdcRtxPj5JhEz+xB6uRky5eyVu/W2HEg=
github.com/jackc/pgservicefile v0.0.0-20240606120523-5a60cdf6a761 h1:iCEnooe7UlwOQYpKFhBabPMi4aNAfoODPEFNiAnClxo=
github.com/jackc/pgservicefile v0.0.0-20240606120523-5a60cdf6a761/go.mod h1:5TJZWKEWniPve33vlWYSoGYefn3gLQRzjfDlhSJ9ZKM=
github.com/jackc/pgx/v5 v5.7.4 h1:9wKznZrhWa2QiHL+NjTSPP6yjl3451BX3imWDnokYlg=
github.com/jackc/pgx/v5 v5.7.4/go.mod h1:ncY89UGWxg82EykZUwSpUKEfccBGGYq1xjrOpsbsfGQ=
github.com/jackc/puddle/v2 v2.2.2 h1:PR8nw+E/1w0GLuRFSmiioY6UooMp6KJv0/61nB7icHo=
github.com/jackc/puddle/v2 v2.2.2/go.mod h1:vriiEXHvEE654aYKXXjOvZM39qJ0q+azkZFrfEOc3H4=
github.com/kelseyhightower/envconfig v1.4.0 h1:Im6hONhd3pLkfDFsbRgu68RDNkGF1r3dvMUtDTo2cv8=
github.com/kelseyhightower/envconfig v1.4.0/go.mod h1:cccZRl6mQpaq41TPp5QxidR+Sa3axMbJDNb//FQX6Gg=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.8.1 h1:w7B6lhMri9wdJUVmEZPGGhZzrYTPvgJArz7wNPgYKsk=
github.com/stretchr/testify v1.8.1/go.mod h1:w2LPCIKwWwSfY2zedu0+kehJoqGctiVI29o6fzry7u4=
golang.org/x/crypto v0.31.0 h1:ihbySMvVjLAeSH1IbfcRTkD/iNscyz8rGzjF/E5hV6U=
golang.org/x/crypto v0.31.0/go.mod h1:kDsLvtWBEx7MV9tJOj9bnXsPbxwJQ6csT/x4KIN4Ssk=
golang.org/x/sync v0.10.0 h1:3NQrjDixjgGwUOCaF8w2+VYHv0Ve/vGYSbdkTa98gmQ=
golang.org/x/sync v0.10.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/text v0.21.0 h1:zyQAAkrwaneQ066sspRyJaG9VNi/YJ1NfzcGB3hZ/qo=
golang.org/x/text v0.21.0/go.mod h1:4IBbMaMmOPCJ8SecivzSH54+73PCFmPWxNTLm+vZkEQ=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=


@@ -0,0 +1,24 @@
package config
import (
"fmt"
"github.com/kelseyhightower/envconfig"
)
// Config holds all application configuration loaded from environment variables.
type Config struct {
DatabaseURL string `envconfig:"database_url" required:"true"`
Port string `envconfig:"port" default:"8080"`
CorsOrigin string `envconfig:"cors_origin" default:"*"`
LogLevel string `envconfig:"log_level" default:"info"`
}
// Load reads configuration from environment variables and returns a populated Config.
func Load() (*Config, error) {
var cfg Config
if err := envconfig.Process("", &cfg); err != nil {
return nil, fmt.Errorf("failed to load config: %w", err)
}
return &cfg, nil
}

backend/internal/db/db.go

@@ -0,0 +1,34 @@
package db

import (
	"context"
	"fmt"
	"time"

	"github.com/jackc/pgx/v5/pgxpool"
)

// NewPool creates a new pgx connection pool and verifies connectivity with a ping.
func NewPool(databaseURL string) (*pgxpool.Pool, error) {
	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	pool, err := pgxpool.New(ctx, databaseURL)
	if err != nil {
		return nil, fmt.Errorf("failed to create db pool: %w", err)
	}
	if err := pool.Ping(ctx); err != nil {
		pool.Close()
		return nil, fmt.Errorf("failed to ping db: %w", err)
	}
	return pool, nil
}

// ClosePool gracefully closes the connection pool.
func ClosePool(pool *pgxpool.Pool) {
	if pool != nil {
		pool.Close()
	}
}

View File

@@ -0,0 +1,67 @@
// Package dtos defines request/response data transfer objects for the Extrudex API.
// DTOs keep HTTP serialization concerns separate from domain models.
package dtos
// ============================================================================
// Common Response Wrappers
// ============================================================================
// ListResponse wraps a paginated collection response.
type ListResponse struct {
Data any `json:"data"`
Total int `json:"total"`
Limit int `json:"limit"`
Offset int `json:"offset"`
}
// SingleResponse wraps a single-item response.
type SingleResponse struct {
Data any `json:"data"`
}
// ErrorResponse is the standard error payload for all API errors.
type ErrorResponse struct {
Error string `json:"error"`
Code int `json:"code"`
}
// ============================================================================
// Filament DTOs
// ============================================================================
// CreateFilamentRequest is the POST body for creating a new filament spool.
type CreateFilamentRequest struct {
Name string `json:"name"`
MaterialBaseID int `json:"material_base_id"`
MaterialFinishID int `json:"material_finish_id"`
MaterialModifierID *int `json:"material_modifier_id,omitempty"`
ColorHex string `json:"color_hex"`
Brand *string `json:"brand,omitempty"`
DiameterMM *float64 `json:"diameter_mm,omitempty"` // defaults to 1.75
InitialGrams int `json:"initial_grams"`
RemainingGrams int `json:"remaining_grams"`
SpoolWeightGrams *int `json:"spool_weight_grams,omitempty"`
CostUSD *float64 `json:"cost_usd,omitempty"`
LowStockThresholdGrams *int `json:"low_stock_threshold_grams,omitempty"` // defaults to 50
Notes *string `json:"notes,omitempty"`
Barcode *string `json:"barcode,omitempty"`
}
// UpdateFilamentRequest is the PUT body for partially updating a filament spool.
// All fields are optional — only non-nil fields are applied.
type UpdateFilamentRequest struct {
Name *string `json:"name,omitempty"`
MaterialBaseID *int `json:"material_base_id,omitempty"`
MaterialFinishID *int `json:"material_finish_id,omitempty"`
MaterialModifierID *int `json:"material_modifier_id,omitempty"`
ColorHex *string `json:"color_hex,omitempty"`
Brand *string `json:"brand,omitempty"`
DiameterMM *float64 `json:"diameter_mm,omitempty"`
InitialGrams *int `json:"initial_grams,omitempty"`
RemainingGrams *int `json:"remaining_grams,omitempty"`
SpoolWeightGrams *int `json:"spool_weight_grams,omitempty"`
CostUSD *float64 `json:"cost_usd,omitempty"`
LowStockThresholdGrams *int `json:"low_stock_threshold_grams,omitempty"`
Notes *string `json:"notes,omitempty"`
Barcode *string `json:"barcode,omitempty"`
}

View File

@@ -0,0 +1,273 @@
package handlers
import (
"encoding/json"
"log/slog"
"net/http"
"strconv"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/dtos"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/models"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/repositories"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/services"
"github.com/go-chi/chi/v5"
)
// FilamentHandler handles HTTP requests for filament spool CRUD operations.
type FilamentHandler struct {
service *services.FilamentService
}
// NewFilamentHandler creates a FilamentHandler with the given service.
func NewFilamentHandler(service *services.FilamentService) *FilamentHandler {
return &FilamentHandler{service: service}
}
// List handles GET /api/filaments — returns paginated, filtered spools.
func (h *FilamentHandler) List(w http.ResponseWriter, r *http.Request) {
limit, offset := parsePagination(r)
filter := repositories.FilamentFilter{
Material: r.URL.Query().Get("material"),
Finish: r.URL.Query().Get("finish"),
Color: r.URL.Query().Get("color"),
LowStock: r.URL.Query().Get("low_stock") == "true",
Limit: limit,
Offset: offset,
}
spools, total, err := h.service.List(r.Context(), filter)
if err != nil {
slog.Error("failed to list filaments", "error", err)
writeJSON(w, http.StatusInternalServerError, dtos.ErrorResponse{
Error: "internal server error",
Code: http.StatusInternalServerError,
})
return
}
writeJSON(w, http.StatusOK, dtos.ListResponse{
Data: spools,
Total: total,
Limit: limit,
Offset: offset,
})
}
// Get handles GET /api/filaments/{id} — returns a single spool.
func (h *FilamentHandler) Get(w http.ResponseWriter, r *http.Request) {
id, err := strconv.Atoi(chi.URLParam(r, "id"))
if err != nil {
writeJSON(w, http.StatusBadRequest, dtos.ErrorResponse{
Error: "invalid filament ID",
Code: http.StatusBadRequest,
})
return
}
spool, err := h.service.GetByID(r.Context(), id)
if err != nil {
slog.Error("failed to get filament", "id", id, "error", err)
writeJSON(w, http.StatusInternalServerError, dtos.ErrorResponse{
Error: "internal server error",
Code: http.StatusInternalServerError,
})
return
}
if spool == nil {
writeJSON(w, http.StatusNotFound, dtos.ErrorResponse{
Error: "filament not found",
Code: http.StatusNotFound,
})
return
}
writeJSON(w, http.StatusOK, dtos.SingleResponse{Data: spool})
}
// Create handles POST /api/filaments — creates a new filament spool.
func (h *FilamentHandler) Create(w http.ResponseWriter, r *http.Request) {
var req dtos.CreateFilamentRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
writeJSON(w, http.StatusBadRequest, dtos.ErrorResponse{
Error: "invalid request body",
Code: http.StatusBadRequest,
})
return
}
// Validate required fields.
if err := services.ValidateCreateFilamentRequest(req); err != nil {
writeJSON(w, http.StatusBadRequest, dtos.ErrorResponse{
Error: "validation failed: " + err.Error(),
Code: http.StatusBadRequest,
})
return
}
// Build domain model.
spool := models.FilamentSpool{
Name: req.Name,
MaterialBaseID: req.MaterialBaseID,
MaterialFinishID: req.MaterialFinishID,
MaterialModifierID: req.MaterialModifierID,
ColorHex: req.ColorHex,
Brand: req.Brand,
DiameterMM: 1.75, // default
InitialGrams: req.InitialGrams,
RemainingGrams: req.RemainingGrams,
SpoolWeightGrams: req.SpoolWeightGrams,
CostUSD: req.CostUSD,
LowStockThresholdGrams: 50, // default
Notes: req.Notes,
Barcode: req.Barcode,
}
if req.DiameterMM != nil {
spool.DiameterMM = *req.DiameterMM
}
if req.LowStockThresholdGrams != nil {
spool.LowStockThresholdGrams = *req.LowStockThresholdGrams
}
created, err := h.service.Create(r.Context(), &spool)
if err != nil {
slog.Error("failed to create filament", "error", err)
writeJSON(w, http.StatusInternalServerError, dtos.ErrorResponse{
Error: "internal server error",
Code: http.StatusInternalServerError,
})
return
}
writeJSON(w, http.StatusCreated, dtos.SingleResponse{Data: created})
}
// Update handles PUT /api/filaments/{id} — partially updates a spool.
func (h *FilamentHandler) Update(w http.ResponseWriter, r *http.Request) {
id, err := strconv.Atoi(chi.URLParam(r, "id"))
if err != nil {
writeJSON(w, http.StatusBadRequest, dtos.ErrorResponse{
Error: "invalid filament ID",
Code: http.StatusBadRequest,
})
return
}
var req dtos.UpdateFilamentRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
writeJSON(w, http.StatusBadRequest, dtos.ErrorResponse{
Error: "invalid request body",
Code: http.StatusBadRequest,
})
return
}
// Validate update fields.
if err := services.ValidateUpdateFilamentRequest(req); err != nil {
writeJSON(w, http.StatusBadRequest, dtos.ErrorResponse{
Error: "validation failed: " + err.Error(),
Code: http.StatusBadRequest,
})
return
}
// Build updates map (only non-nil fields).
updates := buildFilamentUpdates(req)
updated, err := h.service.Update(r.Context(), id, updates)
if err != nil {
slog.Error("failed to update filament", "id", id, "error", err)
writeJSON(w, http.StatusInternalServerError, dtos.ErrorResponse{
Error: "internal server error",
Code: http.StatusInternalServerError,
})
return
}
if updated == nil {
writeJSON(w, http.StatusNotFound, dtos.ErrorResponse{
Error: "filament not found",
Code: http.StatusNotFound,
})
return
}
writeJSON(w, http.StatusOK, dtos.SingleResponse{Data: updated})
}
// Delete handles DELETE /api/filaments/{id} — soft-deletes a spool.
func (h *FilamentHandler) Delete(w http.ResponseWriter, r *http.Request) {
id, err := strconv.Atoi(chi.URLParam(r, "id"))
if err != nil {
writeJSON(w, http.StatusBadRequest, dtos.ErrorResponse{
Error: "invalid filament ID",
Code: http.StatusBadRequest,
})
return
}
deleted, err := h.service.SoftDelete(r.Context(), id)
if err != nil {
slog.Error("failed to delete filament", "id", id, "error", err)
writeJSON(w, http.StatusInternalServerError, dtos.ErrorResponse{
Error: "internal server error",
Code: http.StatusInternalServerError,
})
return
}
if !deleted {
writeJSON(w, http.StatusNotFound, dtos.ErrorResponse{
Error: "filament not found",
Code: http.StatusNotFound,
})
return
}
w.WriteHeader(http.StatusNoContent)
}
// buildFilamentUpdates converts an UpdateFilamentRequest to a map of column→value.
func buildFilamentUpdates(req dtos.UpdateFilamentRequest) map[string]interface{} {
updates := make(map[string]interface{})
if req.Name != nil {
updates["name"] = *req.Name
}
if req.MaterialBaseID != nil {
updates["material_base_id"] = *req.MaterialBaseID
}
if req.MaterialFinishID != nil {
updates["material_finish_id"] = *req.MaterialFinishID
}
if req.MaterialModifierID != nil {
updates["material_modifier_id"] = *req.MaterialModifierID
}
if req.ColorHex != nil {
updates["color_hex"] = *req.ColorHex
}
if req.Brand != nil {
updates["brand"] = *req.Brand
}
if req.DiameterMM != nil {
updates["diameter_mm"] = *req.DiameterMM
}
if req.InitialGrams != nil {
updates["initial_grams"] = *req.InitialGrams
}
if req.RemainingGrams != nil {
updates["remaining_grams"] = *req.RemainingGrams
}
if req.SpoolWeightGrams != nil {
updates["spool_weight_grams"] = *req.SpoolWeightGrams
}
if req.CostUSD != nil {
updates["cost_usd"] = *req.CostUSD
}
if req.LowStockThresholdGrams != nil {
updates["low_stock_threshold_grams"] = *req.LowStockThresholdGrams
}
if req.Notes != nil {
updates["notes"] = *req.Notes
}
if req.Barcode != nil {
updates["barcode"] = *req.Barcode
}
return updates
}
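A repository consuming this column→value map would typically turn it into a `SET` clause with positional placeholders. A hedged sketch of that step (not the actual repository code), sorting keys so the generated SQL is deterministic:

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// buildSetClause turns a column→value map into "SET col = $1, ..." plus the
// matching ordered argument slice; $1.. are PostgreSQL positional placeholders.
func buildSetClause(updates map[string]interface{}) (string, []interface{}) {
	cols := make([]string, 0, len(updates))
	for c := range updates {
		cols = append(cols, c)
	}
	sort.Strings(cols) // map iteration is random; sort for deterministic placeholders

	parts := make([]string, 0, len(cols))
	args := make([]interface{}, 0, len(cols))
	for i, c := range cols {
		parts = append(parts, fmt.Sprintf("%s = $%d", c, i+1))
		args = append(args, updates[c])
	}
	return "SET " + strings.Join(parts, ", "), args
}

func main() {
	clause, args := buildSetClause(map[string]interface{}{
		"name":            "Matte Red PLA",
		"remaining_grams": 640,
	})
	fmt.Println(clause, len(args)) // SET name = $1, remaining_grams = $2 2
}
```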

View File

@@ -0,0 +1,50 @@
package handlers
import (
"context"
"encoding/json"
"log/slog"
"net/http"
"time"
"github.com/jackc/pgx/v5/pgxpool"
)
// HealthHandler provides a health check endpoint that verifies database connectivity.
type HealthHandler struct {
dbPool *pgxpool.Pool
}
// NewHealthHandler creates a new HealthHandler with the given database pool.
func NewHealthHandler(dbPool *pgxpool.Pool) *HealthHandler {
return &HealthHandler{dbPool: dbPool}
}
// ServeHTTP handles GET /health requests.
func (h *HealthHandler) ServeHTTP(w http.ResponseWriter, r *http.Request) {
ctx, cancel := context.WithTimeout(r.Context(), 5*time.Second)
defer cancel()
dbConnected := false
if h.dbPool != nil {
if err := h.dbPool.Ping(ctx); err == nil {
dbConnected = true
} else {
slog.Warn("health check db ping failed", "error", err)
}
}
status := "ok"
if !dbConnected {
status = "degraded"
}
resp := map[string]any{
"status": status,
"timestamp": time.Now().UTC().Format(time.RFC3339),
"db_connected": dbConnected,
}
w.Header().Set("Content-Type", "application/json")
if !dbConnected {
w.WriteHeader(http.StatusServiceUnavailable)
}
if err := json.NewEncoder(w).Encode(resp); err != nil {
slog.Error("failed to encode health response", "error", err)
}
}

View File

@@ -0,0 +1,51 @@
package handlers
import (
"encoding/json"
"log/slog"
"net/http"
"strconv"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/dtos"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/services"
)
// writeJSON serializes v as JSON to the response writer with the given status code.
// Logs an error if encoding fails.
func writeJSON(w http.ResponseWriter, status int, v interface{}) {
w.Header().Set("Content-Type", "application/json")
w.WriteHeader(status)
if err := json.NewEncoder(w).Encode(v); err != nil {
slog.Error("failed to encode JSON response", "error", err)
}
}
// parsePagination reads limit and offset query parameters with defaults of 20 and 0.
func parsePagination(r *http.Request) (limit, offset int) {
limit = 20
offset = 0
if l := r.URL.Query().Get("limit"); l != "" {
if parsed, err := strconv.Atoi(l); err == nil && parsed > 0 {
limit = parsed
}
}
if o := r.URL.Query().Get("offset"); o != "" {
if parsed, err := strconv.Atoi(o); err == nil && parsed >= 0 {
offset = parsed
}
}
return
}
// ValidateCreateFilamentRequest validates a CreateFilamentRequest DTO.
// Re-exports the service-layer validator for handler use.
func ValidateCreateFilamentRequest(req dtos.CreateFilamentRequest) error {
return services.ValidateCreateFilamentRequest(req)
}
// ValidateUpdateFilamentRequest validates an UpdateFilamentRequest DTO.
// Re-exports the service-layer validator for handler use.
func ValidateUpdateFilamentRequest(req dtos.UpdateFilamentRequest) error {
return services.ValidateUpdateFilamentRequest(req)
}

View File

@@ -0,0 +1,34 @@
package handlers
import (
"log/slog"
"net/http"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/dtos"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/repositories"
)
// MaterialFinishHandler handles requests for material finish lookup data.
type MaterialFinishHandler struct {
repo *repositories.MaterialFinishRepository
}
// NewMaterialFinishHandler creates a MaterialFinishHandler with the given repository.
func NewMaterialFinishHandler(repo *repositories.MaterialFinishRepository) *MaterialFinishHandler {
return &MaterialFinishHandler{repo: repo}
}
// List handles GET /api/finishes — returns all material finishes.
func (h *MaterialFinishHandler) List(w http.ResponseWriter, r *http.Request) {
finishes, err := h.repo.GetAll(r.Context())
if err != nil {
slog.Error("failed to list finishes", "error", err)
writeJSON(w, http.StatusInternalServerError, dtos.ErrorResponse{
Error: "internal server error",
Code: http.StatusInternalServerError,
})
return
}
writeJSON(w, http.StatusOK, dtos.SingleResponse{Data: finishes})
}

View File

@@ -0,0 +1,34 @@
package handlers
import (
"log/slog"
"net/http"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/dtos"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/repositories"
)
// MaterialHandler handles requests for material lookup data.
type MaterialHandler struct {
repo *repositories.MaterialRepository
}
// NewMaterialHandler creates a MaterialHandler with the given repository.
func NewMaterialHandler(repo *repositories.MaterialRepository) *MaterialHandler {
return &MaterialHandler{repo: repo}
}
// List handles GET /api/materials — returns all material bases.
func (h *MaterialHandler) List(w http.ResponseWriter, r *http.Request) {
materials, err := h.repo.GetAll(r.Context())
if err != nil {
slog.Error("failed to list materials", "error", err)
writeJSON(w, http.StatusInternalServerError, dtos.ErrorResponse{
Error: "internal server error",
Code: http.StatusInternalServerError,
})
return
}
writeJSON(w, http.StatusOK, dtos.SingleResponse{Data: materials})
}

View File

@@ -0,0 +1,34 @@
package handlers
import (
"log/slog"
"net/http"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/dtos"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/repositories"
)
// MaterialModifierHandler handles requests for material modifier lookup data.
type MaterialModifierHandler struct {
repo *repositories.MaterialModifierRepository
}
// NewMaterialModifierHandler creates a MaterialModifierHandler with the given repository.
func NewMaterialModifierHandler(repo *repositories.MaterialModifierRepository) *MaterialModifierHandler {
return &MaterialModifierHandler{repo: repo}
}
// List handles GET /api/modifiers — returns all material modifiers.
func (h *MaterialModifierHandler) List(w http.ResponseWriter, r *http.Request) {
modifiers, err := h.repo.GetAll(r.Context())
if err != nil {
slog.Error("failed to list modifiers", "error", err)
writeJSON(w, http.StatusInternalServerError, dtos.ErrorResponse{
Error: "internal server error",
Code: http.StatusInternalServerError,
})
return
}
writeJSON(w, http.StatusOK, dtos.SingleResponse{Data: modifiers})
}

View File

@@ -0,0 +1,60 @@
package handlers
import (
"log/slog"
"net/http"
"strconv"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/dtos"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/repositories"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/services"
)
// PrintJobHandler handles HTTP requests for print job operations.
type PrintJobHandler struct {
service *services.PrintJobService
}
// NewPrintJobHandler creates a PrintJobHandler with the given service.
func NewPrintJobHandler(service *services.PrintJobService) *PrintJobHandler {
return &PrintJobHandler{service: service}
}
// List handles GET /api/print-jobs — returns paginated, filtered print jobs.
func (h *PrintJobHandler) List(w http.ResponseWriter, r *http.Request) {
limit, offset := parsePagination(r)
filter := repositories.PrintJobFilter{
Status: r.URL.Query().Get("status"),
Limit: limit,
Offset: offset,
}
if pidStr := r.URL.Query().Get("printer_id"); pidStr != "" {
pid, err := strconv.Atoi(pidStr)
if err != nil {
writeJSON(w, http.StatusBadRequest, dtos.ErrorResponse{
Error: "invalid printer_id",
Code: http.StatusBadRequest,
})
return
}
filter.PrinterID = &pid
}
jobs, total, err := h.service.List(r.Context(), filter)
if err != nil {
slog.Error("failed to list print jobs", "error", err)
writeJSON(w, http.StatusInternalServerError, dtos.ErrorResponse{
Error: "internal server error",
Code: http.StatusInternalServerError,
})
return
}
writeJSON(w, http.StatusOK, dtos.ListResponse{
Data: jobs,
Total: total,
Limit: limit,
Offset: offset,
})
}

View File

@@ -0,0 +1,34 @@
package handlers
import (
"log/slog"
"net/http"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/dtos"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/services"
)
// PrinterHandler handles HTTP requests for printer listings.
type PrinterHandler struct {
service *services.PrinterService
}
// NewPrinterHandler creates a PrinterHandler with the given service.
func NewPrinterHandler(service *services.PrinterService) *PrinterHandler {
return &PrinterHandler{service: service}
}
// List handles GET /api/printers — returns all printers with printer_type info.
func (h *PrinterHandler) List(w http.ResponseWriter, r *http.Request) {
printers, err := h.service.List(r.Context())
if err != nil {
slog.Error("failed to list printers", "error", err)
writeJSON(w, http.StatusInternalServerError, dtos.ErrorResponse{
Error: "internal server error",
Code: http.StatusInternalServerError,
})
return
}
writeJSON(w, http.StatusOK, dtos.SingleResponse{Data: printers})
}

View File

@@ -0,0 +1,70 @@
package handlers
import (
"log/slog"
"net/http"
"strconv"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/dtos"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/repositories"
)
// UsageLogHandler handles HTTP requests for usage log operations.
type UsageLogHandler struct {
repo *repositories.UsageLogRepository
}
// NewUsageLogHandler creates a UsageLogHandler with the given repository.
func NewUsageLogHandler(repo *repositories.UsageLogRepository) *UsageLogHandler {
return &UsageLogHandler{repo: repo}
}
// List handles GET /api/usage-logs — returns paginated, filtered usage logs.
func (h *UsageLogHandler) List(w http.ResponseWriter, r *http.Request) {
limit, offset := parsePagination(r)
filter := repositories.UsageLogFilter{
Limit: limit,
Offset: offset,
}
if sidStr := r.URL.Query().Get("spool_id"); sidStr != "" {
sid, err := strconv.Atoi(sidStr)
if err != nil {
writeJSON(w, http.StatusBadRequest, dtos.ErrorResponse{
Error: "invalid spool_id",
Code: http.StatusBadRequest,
})
return
}
filter.SpoolID = &sid
}
if jidStr := r.URL.Query().Get("job_id"); jidStr != "" {
jid, err := strconv.Atoi(jidStr)
if err != nil {
writeJSON(w, http.StatusBadRequest, dtos.ErrorResponse{
Error: "invalid job_id",
Code: http.StatusBadRequest,
})
return
}
filter.JobID = &jid
}
logs, total, err := h.repo.GetAll(r.Context(), filter)
if err != nil {
slog.Error("failed to list usage logs", "error", err)
writeJSON(w, http.StatusInternalServerError, dtos.ErrorResponse{
Error: "internal server error",
Code: http.StatusInternalServerError,
})
return
}
writeJSON(w, http.StatusOK, dtos.ListResponse{
Data: logs,
Total: total,
Limit: limit,
Offset: offset,
})
}

View File

@@ -0,0 +1,162 @@
// Package models defines the Extrudex domain model structs.
// These map 1:1 to PostgreSQL tables with snake_case JSON serialization.
// Nullable fields use pointer types; all timestamps are time.Time.
package models
import "time"
// ============================================================================
// Lookup Tables
// ============================================================================
// PrinterType represents a printer technology category (fdm, resin, etc.).
type PrinterType struct {
ID int `json:"id"`
Name string `json:"name"`
CreatedAt time.Time `json:"created_at"`
UpdatedAt time.Time `json:"updated_at"`
}
// JobStatus represents a print job lifecycle state.
type JobStatus struct {
ID int `json:"id"`
Name string `json:"name"`
CreatedAt time.Time `json:"created_at"`
UpdatedAt time.Time `json:"updated_at"`
}
// MaterialBase represents a base material type (PLA, PETG, ABS, etc.).
// Density and temperature ranges are stored here for grams-calculation and slicing guidance.
type MaterialBase struct {
ID int `json:"id"`
Name string `json:"name"`
DensityGCm3 float64 `json:"density_g_cm3"`
ExtrusionTempMin *int `json:"extrusion_temp_min,omitempty"`
ExtrusionTempMax *int `json:"extrusion_temp_max,omitempty"`
BedTempMin *int `json:"bed_temp_min,omitempty"`
BedTempMax *int `json:"bed_temp_max,omitempty"`
CreatedAt time.Time `json:"created_at"`
UpdatedAt time.Time `json:"updated_at"`
}
// MaterialFinish represents the visual/texture finish (Basic, Silk, Matte, etc.).
type MaterialFinish struct {
ID int `json:"id"`
Name string `json:"name"`
Description *string `json:"description,omitempty"`
CreatedAt time.Time `json:"created_at"`
UpdatedAt time.Time `json:"updated_at"`
}
// MaterialModifier represents an additive property (Carbon Fiber, Wood-Filled, etc.).
type MaterialModifier struct {
ID int `json:"id"`
Name string `json:"name"`
Description *string `json:"description,omitempty"`
CreatedAt time.Time `json:"created_at"`
UpdatedAt time.Time `json:"updated_at"`
}
// ============================================================================
// Core Entity Tables
// ============================================================================
// Printer represents a 3D printer in the fleet.
type Printer struct {
ID int `json:"id"`
Name string `json:"name"`
PrinterTypeID int `json:"printer_type_id"`
PrinterType *PrinterType `json:"printer_type,omitempty"` // populated on JOIN queries
Manufacturer *string `json:"manufacturer,omitempty"`
Model *string `json:"model,omitempty"`
MoonrakerURL *string `json:"moonraker_url,omitempty"`
MoonrakerAPIKey *string `json:"moonraker_api_key,omitempty"`
MQTTBrokerHost *string `json:"mqtt_broker_host,omitempty"`
MQTTTopicPrefix *string `json:"mqtt_topic_prefix,omitempty"`
MQTTTLSEnabled bool `json:"mqtt_tls_enabled"`
IsActive bool `json:"is_active"`
CreatedAt time.Time `json:"created_at"`
UpdatedAt time.Time `json:"updated_at"`
}
// FilamentSpool represents a physical filament spool in inventory.
// material_finish_id defaults to 1 ("Basic"); material_modifier_id is optional.
// Grams are always physically measured values — grams_used is derived, not stored.
type FilamentSpool struct {
ID int `json:"id"`
Name string `json:"name"`
MaterialBaseID int `json:"material_base_id"`
MaterialBase *MaterialBase `json:"material_base,omitempty"` // JOIN
MaterialFinishID int `json:"material_finish_id"`
MaterialFinish *MaterialFinish `json:"material_finish,omitempty"` // JOIN
MaterialModifierID *int `json:"material_modifier_id,omitempty"`
MaterialModifier *MaterialModifier `json:"material_modifier,omitempty"` // JOIN
ColorHex string `json:"color_hex"`
Brand *string `json:"brand,omitempty"`
DiameterMM float64 `json:"diameter_mm"`
InitialGrams int `json:"initial_grams"`
RemainingGrams int `json:"remaining_grams"`
SpoolWeightGrams *int `json:"spool_weight_grams,omitempty"`
CostUSD *float64 `json:"cost_usd,omitempty"`
LowStockThresholdGrams int `json:"low_stock_threshold_grams"`
Notes *string `json:"notes,omitempty"`
Barcode *string `json:"barcode,omitempty"`
DeletedAt *time.Time `json:"deleted_at,omitempty"`
CreatedAt time.Time `json:"created_at"`
UpdatedAt time.Time `json:"updated_at"`
}
// PrintJob represents a single print on a specific printer.
// The filament_spool_id is a convenience reference; multi-spool jobs track usage in usage_logs.
type PrintJob struct {
ID int `json:"id"`
PrinterID int `json:"printer_id"`
Printer *Printer `json:"printer,omitempty"` // JOIN
FilamentSpoolID *int `json:"filament_spool_id,omitempty"`
FilamentSpool *FilamentSpool `json:"filament_spool,omitempty"` // JOIN
JobName string `json:"job_name"`
FileName *string `json:"file_name,omitempty"`
JobStatusID int `json:"job_status_id"`
JobStatus *JobStatus `json:"job_status,omitempty"` // JOIN
StartedAt *time.Time `json:"started_at,omitempty"`
CompletedAt *time.Time `json:"completed_at,omitempty"`
DurationSeconds *int `json:"duration_seconds,omitempty"`
EstimatedDurationSeconds *int `json:"estimated_duration_seconds,omitempty"`
TotalMMExtruded *float64 `json:"total_mm_extruded,omitempty"`
TotalGramsUsed *float64 `json:"total_grams_used,omitempty"`
TotalCostUSD *float64 `json:"total_cost_usd,omitempty"`
Notes *string `json:"notes,omitempty"`
DeletedAt *time.Time `json:"deleted_at,omitempty"`
CreatedAt time.Time `json:"created_at"`
UpdatedAt time.Time `json:"updated_at"`
}
// UsageLog records filament consumption for a specific spool during a print job.
// This is the atomic unit of filament tracking — grams are derived from mm_extruded.
type UsageLog struct {
ID int `json:"id"`
PrintJobID int `json:"print_job_id"`
PrintJob *PrintJob `json:"print_job,omitempty"` // JOIN
FilamentSpoolID int `json:"filament_spool_id"`
FilamentSpool *FilamentSpool `json:"filament_spool,omitempty"` // JOIN
MMExtruded float64 `json:"mm_extruded"`
GramsUsed float64 `json:"grams_used"`
CostUSD *float64 `json:"cost_usd,omitempty"`
LoggedAt time.Time `json:"logged_at"`
CreatedAt time.Time `json:"created_at"`
}
// ============================================================================
// Application Settings
// ============================================================================
// Setting represents a key-value application configuration entry.
// The value is stored as JSONB in PostgreSQL, allowing flexible typed config.
type Setting struct {
ID int `json:"id"`
Key string `json:"key"`
Value []byte `json:"value"` // raw JSON — marshalled/unmarshalled by caller
Description *string `json:"description,omitempty"`
CreatedAt time.Time `json:"created_at"`
UpdatedAt time.Time `json:"updated_at"`
}
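The grams-from-mm derivation mentioned on `UsageLog` follows from filament geometry: the extruded length is a cylinder of the filament's diameter, so mass = π·(d/2)²·length·density. A sketch with assumed example values (1.75 mm PLA at 1.24 g/cm³; figures are illustrative, not taken from this repo's seed data):

```go
package main

import (
	"fmt"
	"math"
)

// gramsFromMM converts extruded filament length to mass.
// mmExtruded: length in mm; diameterMM: filament diameter in mm;
// densityGCm3: material density in g/cm³ (1 cm³ = 1000 mm³).
func gramsFromMM(mmExtruded, diameterMM, densityGCm3 float64) float64 {
	radius := diameterMM / 2
	volumeMM3 := math.Pi * radius * radius * mmExtruded
	return volumeMM3 / 1000 * densityGCm3
}

func main() {
	g := gramsFromMM(1000, 1.75, 1.24)
	fmt.Printf("%.2f g\n", g) // about 2.98 g for a metre of 1.75 mm PLA
}
```

This is why `MaterialBase` stores `DensityGCm3` and `FilamentSpool` stores `DiameterMM`: together they fix the conversion factor for each spool.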

View File

@@ -0,0 +1,285 @@
package repositories
import (
"context"
"errors"
"fmt"
"strings"
"time"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/models"
"github.com/jackc/pgx/v5"
"github.com/jackc/pgx/v5/pgxpool"
)
// FilamentRepository handles database queries for filament_spools.
type FilamentRepository struct {
pool *pgxpool.Pool
}
// NewFilamentRepository creates a FilamentRepository backed by the given pool.
func NewFilamentRepository(pool *pgxpool.Pool) *FilamentRepository {
return &FilamentRepository{pool: pool}
}
// FilamentFilter holds query parameters for listing filament spools.
type FilamentFilter struct {
Material string // filter by material_base name (case-insensitive)
Finish string // filter by material_finish name (case-insensitive)
Color string // filter by exact color_hex match
LowStock bool // if true, filter for remaining_grams <= low_stock_threshold_grams
Limit int
Offset int
}
// spoolScanFields is the common SELECT column list for filament spools with JOINs.
const spoolScanFields = `
s.id, s.name,
s.material_base_id,
COALESCE(mb.name, '') as material_base_name,
COALESCE(mb.density_g_cm3, 0) as material_base_density_g_cm3,
mb.extrusion_temp_min as material_base_extrusion_temp_min,
mb.extrusion_temp_max as material_base_extrusion_temp_max,
mb.bed_temp_min as material_base_bed_temp_min,
mb.bed_temp_max as material_base_bed_temp_max,
COALESCE(mb.created_at, s.created_at) as material_base_created_at,
COALESCE(mb.updated_at, s.created_at) as material_base_updated_at,
s.material_finish_id,
COALESCE(mf.name, '') as material_finish_name,
mf.description as material_finish_description,
COALESCE(mf.created_at, s.created_at) as material_finish_created_at,
COALESCE(mf.updated_at, s.created_at) as material_finish_updated_at,
s.material_modifier_id,
mm.name as material_modifier_name,
mm.description as material_modifier_description,
mm.created_at as material_modifier_created_at,
mm.updated_at as material_modifier_updated_at,
s.color_hex, s.brand, s.diameter_mm,
s.initial_grams, s.remaining_grams, s.spool_weight_grams,
s.cost_usd, s.low_stock_threshold_grams,
s.notes, s.barcode,
s.deleted_at, s.created_at, s.updated_at`
const spoolFromJoins = `
FROM filament_spools s
LEFT JOIN material_bases mb ON s.material_base_id = mb.id
LEFT JOIN material_finishes mf ON s.material_finish_id = mf.id
LEFT JOIN material_modifiers mm ON s.material_modifier_id = mm.id`
// scanSpoolWithJoins scans a full spool row including all JOINed tables.
func scanSpoolWithJoins(row interface{ Scan(...interface{}) error }) (models.FilamentSpool, error) {
var s models.FilamentSpool
var mb models.MaterialBase
var mf models.MaterialFinish
var mfDesc *string
var modifierID *int
var modName, modDesc *string
var modCreatedAt, modUpdatedAt *time.Time
err := row.Scan(
&s.ID, &s.Name,
&s.MaterialBaseID,
&mb.Name, &mb.DensityGCm3,
&mb.ExtrusionTempMin, &mb.ExtrusionTempMax,
&mb.BedTempMin, &mb.BedTempMax,
&mb.CreatedAt, &mb.UpdatedAt,
&s.MaterialFinishID,
&mf.Name, &mfDesc,
&mf.CreatedAt, &mf.UpdatedAt,
&modifierID,
&modName, &modDesc,
&modCreatedAt, &modUpdatedAt,
&s.ColorHex, &s.Brand, &s.DiameterMM,
&s.InitialGrams, &s.RemainingGrams, &s.SpoolWeightGrams,
&s.CostUSD, &s.LowStockThresholdGrams,
&s.Notes, &s.Barcode,
&s.DeletedAt, &s.CreatedAt, &s.UpdatedAt,
)
if err != nil {
return s, err
}
mb.ID = s.MaterialBaseID
s.MaterialBase = &mb
mf.ID = s.MaterialFinishID
mf.Description = mfDesc
s.MaterialFinish = &mf
s.MaterialModifierID = modifierID
if modifierID != nil && modName != nil {
mm := models.MaterialModifier{
ID: *modifierID,
Name: *modName,
}
if modDesc != nil {
mm.Description = modDesc
}
if modCreatedAt != nil {
mm.CreatedAt = *modCreatedAt
}
if modUpdatedAt != nil {
mm.UpdatedAt = *modUpdatedAt
}
s.MaterialModifier = &mm
}
return s, nil
}
// GetAll returns filament spools matching the given filters, with pagination.
// Returns results, total matching count, and any error.
func (r *FilamentRepository) GetAll(ctx context.Context, filter FilamentFilter) ([]models.FilamentSpool, int, error) {
conditions := []string{"s.deleted_at IS NULL"}
args := []interface{}{}
argIdx := 1
if filter.Material != "" {
conditions = append(conditions, fmt.Sprintf("LOWER(mb.name) = LOWER($%d)", argIdx))
args = append(args, filter.Material)
argIdx++
}
if filter.Finish != "" {
conditions = append(conditions, fmt.Sprintf("LOWER(mf.name) = LOWER($%d)", argIdx))
args = append(args, filter.Finish)
argIdx++
}
if filter.Color != "" {
conditions = append(conditions, fmt.Sprintf("s.color_hex = $%d", argIdx))
args = append(args, filter.Color)
argIdx++
}
if filter.LowStock {
conditions = append(conditions, "s.remaining_grams <= s.low_stock_threshold_grams")
}
whereClause := ""
if len(conditions) > 0 {
whereClause = "WHERE " + strings.Join(conditions, " AND ")
}
// Count total.
var total int
countQuery := "SELECT COUNT(*) " + spoolFromJoins + " " + whereClause
if err := r.pool.QueryRow(ctx, countQuery, args...).Scan(&total); err != nil {
return nil, 0, err
}
// Query with pagination.
dataQuery := "SELECT " + spoolScanFields + " " + spoolFromJoins + " " +
whereClause +
" ORDER BY s.name ASC" +
fmt.Sprintf(" LIMIT $%d OFFSET $%d", argIdx, argIdx+1)
dataArgs := make([]interface{}, len(args))
copy(dataArgs, args)
dataArgs = append(dataArgs, filter.Limit, filter.Offset)
rows, err := r.pool.Query(ctx, dataQuery, dataArgs...)
if err != nil {
return nil, 0, err
}
defer rows.Close()
var spools []models.FilamentSpool
for rows.Next() {
s, err := scanSpoolWithJoins(rows)
if err != nil {
return nil, 0, err
}
spools = append(spools, s)
}
if err := rows.Err(); err != nil {
return nil, 0, err
}
if spools == nil {
spools = []models.FilamentSpool{}
}
return spools, total, nil
}
// GetByID returns a single filament spool by ID with JOINed data.
// If the spool does not exist or is soft-deleted, the scan fails with
// pgx.ErrNoRows, which is returned to the caller.
func (r *FilamentRepository) GetByID(ctx context.Context, id int) (*models.FilamentSpool, error) {
query := "SELECT " + spoolScanFields + " " + spoolFromJoins +
" WHERE s.id = $1 AND s.deleted_at IS NULL"
row := r.pool.QueryRow(ctx, query, id)
s, err := scanSpoolWithJoins(row)
if err != nil {
return nil, err
}
return &s, nil
}
// Create inserts a new filament spool and returns the created spool with JOINed data.
func (r *FilamentRepository) Create(ctx context.Context, spool *models.FilamentSpool) (*models.FilamentSpool, error) {
var id int
err := r.pool.QueryRow(ctx, `
INSERT INTO filament_spools (
name, material_base_id, material_finish_id, material_modifier_id,
color_hex, brand, diameter_mm, initial_grams, remaining_grams,
spool_weight_grams, cost_usd, low_stock_threshold_grams,
notes, barcode
) VALUES ($1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11,$12,$13,$14)
RETURNING id
`,
spool.Name, spool.MaterialBaseID, spool.MaterialFinishID, spool.MaterialModifierID,
spool.ColorHex, spool.Brand, spool.DiameterMM, spool.InitialGrams, spool.RemainingGrams,
spool.SpoolWeightGrams, spool.CostUSD, spool.LowStockThresholdGrams,
spool.Notes, spool.Barcode,
).Scan(&id)
if err != nil {
return nil, err
}
return r.GetByID(ctx, id)
}
// Update applies partial updates to an existing filament spool.
// Only columns present in the updates map are modified; column names are
// interpolated into the SQL, so callers must whitelist them.
// Returns the updated spool, or nil if the spool does not exist or is deleted.
func (r *FilamentRepository) Update(ctx context.Context, id int, updates map[string]interface{}) (*models.FilamentSpool, error) {
if len(updates) == 0 {
return r.GetByID(ctx, id)
}
setClauses := []string{"updated_at = NOW()"}
args := []interface{}{}
argIdx := 1
for col, val := range updates {
setClauses = append(setClauses, fmt.Sprintf("%s = $%d", col, argIdx))
args = append(args, val)
argIdx++
}
args = append(args, id)
query := fmt.Sprintf("UPDATE filament_spools SET %s WHERE id = $%d AND deleted_at IS NULL",
strings.Join(setClauses, ", "), argIdx)
result, err := r.pool.Exec(ctx, query, args...)
if err != nil {
return nil, err
}
if result.RowsAffected() == 0 {
return nil, nil // not found or deleted
}
return r.GetByID(ctx, id)
}
// SoftDelete marks a filament spool as deleted by setting deleted_at = NOW().
// Returns true if a row was affected.
func (r *FilamentRepository) SoftDelete(ctx context.Context, id int) (bool, error) {
result, err := r.pool.Exec(ctx, `
UPDATE filament_spools
SET deleted_at = NOW(), updated_at = NOW()
WHERE id = $1 AND deleted_at IS NULL
`, id)
if err != nil {
return false, err
}
return result.RowsAffected() > 0, nil
}
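
The filter-building pattern used by GetAll (append a condition, append its argument, bump the positional index) can be sketched in isolation. `buildWhere` and the two filters below are illustrative stand-ins, not part of the repository:

```go
package main

import (
	"fmt"
	"strings"
)

// buildWhere mirrors the repository's filter pattern: each optional filter
// appends one condition with a fresh positional placeholder and one argument,
// so the placeholders always line up with the args slice.
func buildWhere(material, color string) (string, []interface{}) {
	conditions := []string{"s.deleted_at IS NULL"}
	args := []interface{}{}
	argIdx := 1
	if material != "" {
		conditions = append(conditions, fmt.Sprintf("LOWER(mb.name) = LOWER($%d)", argIdx))
		args = append(args, material)
		argIdx++
	}
	if color != "" {
		conditions = append(conditions, fmt.Sprintf("s.color_hex = $%d", argIdx))
		args = append(args, color)
		argIdx++
	}
	return "WHERE " + strings.Join(conditions, " AND "), args
}

func main() {
	where, args := buildWhere("PLA", "#FF0000")
	fmt.Println(where)
	fmt.Println(len(args))
}
```

Because argIdx is the next unused placeholder after the loop, LIMIT/OFFSET can safely be appended as `$argIdx` and `$argIdx+1`, exactly as GetAll does.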

View File

@@ -0,0 +1,50 @@
package repositories
import (
"context"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/models"
"github.com/jackc/pgx/v5/pgxpool"
)
// MaterialFinishRepository handles database queries for material finishes.
type MaterialFinishRepository struct {
pool *pgxpool.Pool
}
// NewMaterialFinishRepository creates a MaterialFinishRepository backed by the given pool.
func NewMaterialFinishRepository(pool *pgxpool.Pool) *MaterialFinishRepository {
return &MaterialFinishRepository{pool: pool}
}
// GetAll returns all material finishes ordered by name.
func (r *MaterialFinishRepository) GetAll(ctx context.Context) ([]models.MaterialFinish, error) {
rows, err := r.pool.Query(ctx, `
SELECT id, name, description, created_at, updated_at
FROM material_finishes
ORDER BY name
`)
if err != nil {
return nil, err
}
defer rows.Close()
var finishes []models.MaterialFinish
for rows.Next() {
var f models.MaterialFinish
if err := rows.Scan(
&f.ID, &f.Name, &f.Description,
&f.CreatedAt, &f.UpdatedAt,
); err != nil {
return nil, err
}
finishes = append(finishes, f)
}
if err := rows.Err(); err != nil {
return nil, err
}
if finishes == nil {
finishes = []models.MaterialFinish{}
}
return finishes, nil
}

View File

@@ -0,0 +1,50 @@
package repositories
import (
"context"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/models"
"github.com/jackc/pgx/v5/pgxpool"
)
// MaterialModifierRepository handles database queries for material modifiers.
type MaterialModifierRepository struct {
pool *pgxpool.Pool
}
// NewMaterialModifierRepository creates a MaterialModifierRepository backed by the given pool.
func NewMaterialModifierRepository(pool *pgxpool.Pool) *MaterialModifierRepository {
return &MaterialModifierRepository{pool: pool}
}
// GetAll returns all material modifiers ordered by name.
func (r *MaterialModifierRepository) GetAll(ctx context.Context) ([]models.MaterialModifier, error) {
rows, err := r.pool.Query(ctx, `
SELECT id, name, description, created_at, updated_at
FROM material_modifiers
ORDER BY name
`)
if err != nil {
return nil, err
}
defer rows.Close()
var modifiers []models.MaterialModifier
for rows.Next() {
var m models.MaterialModifier
if err := rows.Scan(
&m.ID, &m.Name, &m.Description,
&m.CreatedAt, &m.UpdatedAt,
); err != nil {
return nil, err
}
modifiers = append(modifiers, m)
}
if err := rows.Err(); err != nil {
return nil, err
}
if modifiers == nil {
modifiers = []models.MaterialModifier{}
}
return modifiers, nil
}

View File

@@ -0,0 +1,54 @@
// Package repositories provides data access logic backed by PostgreSQL via pgxpool.
package repositories
import (
"context"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/models"
"github.com/jackc/pgx/v5/pgxpool"
)
// MaterialRepository handles database queries for material lookup tables.
type MaterialRepository struct {
pool *pgxpool.Pool
}
// NewMaterialRepository creates a MaterialRepository backed by the given pool.
func NewMaterialRepository(pool *pgxpool.Pool) *MaterialRepository {
return &MaterialRepository{pool: pool}
}
// GetAll returns all material bases ordered by name.
func (r *MaterialRepository) GetAll(ctx context.Context) ([]models.MaterialBase, error) {
rows, err := r.pool.Query(ctx, `
SELECT id, name, density_g_cm3, extrusion_temp_min, extrusion_temp_max,
bed_temp_min, bed_temp_max, created_at, updated_at
FROM material_bases
ORDER BY name
`)
if err != nil {
return nil, err
}
defer rows.Close()
var materials []models.MaterialBase
for rows.Next() {
var m models.MaterialBase
if err := rows.Scan(
&m.ID, &m.Name, &m.DensityGCm3,
&m.ExtrusionTempMin, &m.ExtrusionTempMax,
&m.BedTempMin, &m.BedTempMax,
&m.CreatedAt, &m.UpdatedAt,
); err != nil {
return nil, err
}
materials = append(materials, m)
}
if err := rows.Err(); err != nil {
return nil, err
}
if materials == nil {
materials = []models.MaterialBase{}
}
return materials, nil
}

View File

@@ -0,0 +1,157 @@
package repositories
import (
"context"
"fmt"
"strings"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/models"
"github.com/jackc/pgx/v5/pgxpool"
)
// PrintJobRepository handles database queries for print_jobs.
type PrintJobRepository struct {
pool *pgxpool.Pool
}
// NewPrintJobRepository creates a PrintJobRepository backed by the given pool.
func NewPrintJobRepository(pool *pgxpool.Pool) *PrintJobRepository {
return &PrintJobRepository{pool: pool}
}
// PrintJobFilter holds query parameters for listing print jobs.
type PrintJobFilter struct {
Status string // filter by job_status name (case-insensitive)
PrinterID *int // filter by printer_id
Limit int
Offset int
}
// scanPrintJobWithJoins scans a print_job row with JOINed tables.
func (r *PrintJobRepository) scanPrintJobWithJoins(row interface{ Scan(...interface{}) error }) (models.PrintJob, error) {
var pj models.PrintJob
var js models.JobStatus
err := row.Scan(
&pj.ID, &pj.PrinterID, &pj.FilamentSpoolID,
&pj.JobName, &pj.FileName,
&pj.JobStatusID,
&pj.StartedAt, &pj.CompletedAt,
&pj.DurationSeconds, &pj.EstimatedDurationSeconds,
&pj.TotalMMExtruded, &pj.TotalGramsUsed, &pj.TotalCostUSD,
&pj.Notes,
&pj.DeletedAt, &pj.CreatedAt, &pj.UpdatedAt,
&js.ID, &js.Name,
&js.CreatedAt, &js.UpdatedAt,
)
if err != nil {
return pj, err
}
pj.JobStatus = &js
return pj, nil
}
// GetAll returns print jobs matching the given filters, with pagination.
func (r *PrintJobRepository) GetAll(ctx context.Context, filter PrintJobFilter) ([]models.PrintJob, int, error) {
conditions := []string{"pj.deleted_at IS NULL"}
args := []interface{}{}
argIdx := 1
if filter.Status != "" {
conditions = append(conditions, fmt.Sprintf("LOWER(js.name) = LOWER($%d)", argIdx))
args = append(args, filter.Status)
argIdx++
}
if filter.PrinterID != nil {
conditions = append(conditions, fmt.Sprintf("pj.printer_id = $%d", argIdx))
args = append(args, *filter.PrinterID)
argIdx++
}
whereClause := ""
if len(conditions) > 0 {
whereClause = "WHERE " + strings.Join(conditions, " AND ")
}
// Count.
var total int
countQuery := `SELECT COUNT(*)
FROM print_jobs pj
LEFT JOIN job_statuses js ON pj.job_status_id = js.id
` + " " + whereClause
if err := r.pool.QueryRow(ctx, countQuery, args...).Scan(&total); err != nil {
return nil, 0, err
}
// Query with pagination.
dataQuery := `SELECT
pj.id, pj.printer_id, pj.filament_spool_id,
pj.job_name, pj.file_name,
pj.job_status_id,
pj.started_at, pj.completed_at,
pj.duration_seconds, pj.estimated_duration_seconds,
pj.total_mm_extruded, pj.total_grams_used, pj.total_cost_usd,
pj.notes,
pj.deleted_at, pj.created_at, pj.updated_at,
js.id, js.name,
js.created_at, js.updated_at
FROM print_jobs pj
LEFT JOIN job_statuses js ON pj.job_status_id = js.id
` + whereClause +
" ORDER BY pj.created_at DESC" +
fmt.Sprintf(" LIMIT $%d OFFSET $%d", argIdx, argIdx+1)
dataArgs := make([]interface{}, len(args))
copy(dataArgs, args)
dataArgs = append(dataArgs, filter.Limit, filter.Offset)
rows, err := r.pool.Query(ctx, dataQuery, dataArgs...)
if err != nil {
return nil, 0, err
}
defer rows.Close()
var jobs []models.PrintJob
for rows.Next() {
pj, err := r.scanPrintJobWithJoins(rows)
if err != nil {
return nil, 0, err
}
jobs = append(jobs, pj)
}
if err := rows.Err(); err != nil {
return nil, 0, err
}
if jobs == nil {
jobs = []models.PrintJob{}
}
return jobs, total, nil
}
// GetByID returns a single print job by ID with JOINed job_status.
func (r *PrintJobRepository) GetByID(ctx context.Context, id int) (*models.PrintJob, error) {
row := r.pool.QueryRow(ctx, `
SELECT
pj.id, pj.printer_id, pj.filament_spool_id,
pj.job_name, pj.file_name,
pj.job_status_id,
pj.started_at, pj.completed_at,
pj.duration_seconds, pj.estimated_duration_seconds,
pj.total_mm_extruded, pj.total_grams_used, pj.total_cost_usd,
pj.notes,
pj.deleted_at, pj.created_at, pj.updated_at,
js.id, js.name,
js.created_at, js.updated_at
FROM print_jobs pj
LEFT JOIN job_statuses js ON pj.job_status_id = js.id
WHERE pj.id = $1 AND pj.deleted_at IS NULL
`, id)
pj, err := r.scanPrintJobWithJoins(row)
if err != nil {
return nil, err
}
return &pj, nil
}

View File

@@ -0,0 +1,78 @@
package repositories
import (
"context"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/models"
"github.com/jackc/pgx/v5/pgxpool"
)
// PrinterRepository handles database queries for printers.
type PrinterRepository struct {
pool *pgxpool.Pool
}
// NewPrinterRepository creates a PrinterRepository backed by the given pool.
func NewPrinterRepository(pool *pgxpool.Pool) *PrinterRepository {
return &PrinterRepository{pool: pool}
}
// scanPrinterWithType scans a printer row with JOINed printer_type.
func (r *PrinterRepository) scanPrinterWithType(row interface{ Scan(...interface{}) error }) (models.Printer, error) {
var p models.Printer
var pt models.PrinterType
err := row.Scan(
&p.ID, &p.Name, &p.PrinterTypeID,
&p.Manufacturer, &p.Model,
&p.MoonrakerURL, &p.MoonrakerAPIKey,
&p.MQTTBrokerHost, &p.MQTTTopicPrefix,
&p.MQTTTLSEnabled, &p.IsActive,
&p.CreatedAt, &p.UpdatedAt,
&pt.ID, &pt.Name,
&pt.CreatedAt, &pt.UpdatedAt,
)
if err != nil {
return p, err
}
p.PrinterType = &pt
return p, nil
}
// GetAll returns all printers joined with their printer_type, ordered by name.
func (r *PrinterRepository) GetAll(ctx context.Context) ([]models.Printer, error) {
rows, err := r.pool.Query(ctx, `
SELECT p.id, p.name, p.printer_type_id,
p.manufacturer, p.model,
p.moonraker_url, p.moonraker_api_key,
p.mqtt_broker_host, p.mqtt_topic_prefix,
p.mqtt_tls_enabled, p.is_active,
p.created_at, p.updated_at,
pt.id, pt.name,
pt.created_at, pt.updated_at
FROM printers p
JOIN printer_types pt ON p.printer_type_id = pt.id
ORDER BY p.name
`)
if err != nil {
return nil, err
}
defer rows.Close()
var printers []models.Printer
for rows.Next() {
p, err := r.scanPrinterWithType(rows)
if err != nil {
return nil, err
}
printers = append(printers, p)
}
if err := rows.Err(); err != nil {
return nil, err
}
if printers == nil {
printers = []models.Printer{}
}
return printers, nil
}

View File

@@ -0,0 +1,96 @@
package repositories
import (
"context"
"fmt"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/models"
"github.com/jackc/pgx/v5/pgxpool"
)
// UsageLogRepository handles database queries for usage_logs.
type UsageLogRepository struct {
pool *pgxpool.Pool
}
// NewUsageLogRepository creates a UsageLogRepository backed by the given pool.
func NewUsageLogRepository(pool *pgxpool.Pool) *UsageLogRepository {
return &UsageLogRepository{pool: pool}
}
// UsageLogFilter holds query parameters for listing usage logs.
type UsageLogFilter struct {
SpoolID *int // filter by filament_spool_id
JobID *int // filter by print_job_id
Limit int
Offset int
}
// GetAll returns usage logs matching the given filters, with pagination.
func (r *UsageLogRepository) GetAll(ctx context.Context, filter UsageLogFilter) ([]models.UsageLog, int, error) {
conditions := []string{"1=1"}
args := []interface{}{}
argIdx := 1
if filter.SpoolID != nil {
conditions = append(conditions, fmt.Sprintf("ul.filament_spool_id = $%d", argIdx))
args = append(args, *filter.SpoolID)
argIdx++
}
if filter.JobID != nil {
conditions = append(conditions, fmt.Sprintf("ul.print_job_id = $%d", argIdx))
args = append(args, *filter.JobID)
argIdx++
}
whereClause := "WHERE " + conditions[0]
for _, c := range conditions[1:] {
whereClause += " AND " + c
}
// Count.
var total int
countQuery := "SELECT COUNT(*) FROM usage_logs ul " + whereClause
if err := r.pool.QueryRow(ctx, countQuery, args...).Scan(&total); err != nil {
return nil, 0, err
}
// Query with pagination.
dataQuery := `SELECT id, print_job_id, filament_spool_id, mm_extruded,
grams_used, cost_usd, logged_at, created_at
FROM usage_logs ul
` + whereClause +
" ORDER BY ul.logged_at DESC" +
fmt.Sprintf(" LIMIT $%d OFFSET $%d", argIdx, argIdx+1)
dataArgs := make([]interface{}, len(args))
copy(dataArgs, args)
dataArgs = append(dataArgs, filter.Limit, filter.Offset)
rows, err := r.pool.Query(ctx, dataQuery, dataArgs...)
if err != nil {
return nil, 0, err
}
defer rows.Close()
var logs []models.UsageLog
for rows.Next() {
var l models.UsageLog
if err := rows.Scan(
&l.ID, &l.PrintJobID, &l.FilamentSpoolID,
&l.MMExtruded, &l.GramsUsed, &l.CostUSD,
&l.LoggedAt, &l.CreatedAt,
); err != nil {
return nil, 0, err
}
logs = append(logs, l)
}
if err := rows.Err(); err != nil {
return nil, 0, err
}
if logs == nil {
logs = []models.UsageLog{}
}
return logs, total, nil
}

View File

@@ -0,0 +1,96 @@
package router
import (
"net/http"
"time"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/config"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/handlers"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/repositories"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/services"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/sse"
"github.com/go-chi/chi/v5"
"github.com/go-chi/chi/v5/middleware"
"github.com/jackc/pgx/v5/pgxpool"
)
// New creates and configures a Chi router with all middleware and handlers mounted.
func New(cfg *config.Config, dbPool *pgxpool.Pool, sseBC *sse.Broadcaster) chi.Router {
r := chi.NewRouter()
// Middleware
r.Use(middleware.RequestID)
r.Use(middleware.RealIP)
r.Use(middleware.Logger)
r.Use(middleware.Recoverer)
// Timeout middleware is applied per-route below to exclude SSE
// CORS
r.Use(func(next http.Handler) http.Handler {
return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Access-Control-Allow-Origin", cfg.CorsOrigin)
w.Header().Set("Access-Control-Allow-Methods", "GET, POST, PUT, DELETE, OPTIONS")
w.Header().Set("Access-Control-Allow-Headers", "Content-Type, Authorization")
if r.Method == http.MethodOptions {
w.WriteHeader(http.StatusNoContent)
return
}
next.ServeHTTP(w, r)
})
})
// Health check (with timeout)
healthHandler := handlers.NewHealthHandler(dbPool)
r.With(middleware.Timeout(30 * time.Second)).Get("/health", healthHandler.ServeHTTP)
// ── Repositories ──────────────────────────────────────────────────────
materialRepo := repositories.NewMaterialRepository(dbPool)
finishRepo := repositories.NewMaterialFinishRepository(dbPool)
modifierRepo := repositories.NewMaterialModifierRepository(dbPool)
filamentRepo := repositories.NewFilamentRepository(dbPool)
printerRepo := repositories.NewPrinterRepository(dbPool)
printJobRepo := repositories.NewPrintJobRepository(dbPool)
usageLogRepo := repositories.NewUsageLogRepository(dbPool)
// ── Services ──────────────────────────────────────────────────────────
filamentService := services.NewFilamentService(filamentRepo)
printerService := services.NewPrinterService(printerRepo)
printJobService := services.NewPrintJobService(printJobRepo)
// ── Handlers ──────────────────────────────────────────────────────────
materialHandler := handlers.NewMaterialHandler(materialRepo)
finishHandler := handlers.NewMaterialFinishHandler(finishRepo)
modifierHandler := handlers.NewMaterialModifierHandler(modifierRepo)
filamentHandler := handlers.NewFilamentHandler(filamentService)
printerHandler := handlers.NewPrinterHandler(printerService)
printJobHandler := handlers.NewPrintJobHandler(printJobService)
usageLogHandler := handlers.NewUsageLogHandler(usageLogRepo)
// ── API Routes (with timeout) ─────────────────────────────────────────
r.Route("/api", func(r chi.Router) {
r.Use(middleware.Timeout(60 * time.Second))
r.Get("/materials", materialHandler.List)
r.Get("/finishes", finishHandler.List)
r.Get("/modifiers", modifierHandler.List)
r.Route("/filaments", func(r chi.Router) {
r.Get("/", filamentHandler.List)
r.Post("/", filamentHandler.Create)
r.Route("/{id}", func(r chi.Router) {
r.Get("/", filamentHandler.Get)
r.Put("/", filamentHandler.Update)
r.Delete("/", filamentHandler.Delete)
})
})
r.Get("/printers", printerHandler.List)
r.Get("/print-jobs", printJobHandler.List)
r.Get("/usage-logs", usageLogHandler.List)
})
// SSE events stream: registered outside the /api route group so the
// 60-second Timeout middleware cannot sever the long-lived connection.
sseHandler := sse.NewHandler(sseBC)
r.Get("/api/events", sseHandler.ServeHTTP)
return r
}

View File

@@ -0,0 +1,82 @@
// Package services contains business logic and application services.
package services
import (
"context"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/models"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/repositories"
)
// FilamentService wraps FilamentRepository with business logic and validation.
type FilamentService struct {
repo *repositories.FilamentRepository
}
// NewFilamentService creates a FilamentService backed by the given repository.
func NewFilamentService(repo *repositories.FilamentRepository) *FilamentService {
return &FilamentService{repo: repo}
}
// List returns paginated filament spools filtered by the given criteria.
func (s *FilamentService) List(ctx context.Context, filter repositories.FilamentFilter) ([]models.FilamentSpool, int, error) {
return s.repo.GetAll(ctx, filter)
}
// GetByID returns a single filament spool by ID.
func (s *FilamentService) GetByID(ctx context.Context, id int) (*models.FilamentSpool, error) {
return s.repo.GetByID(ctx, id)
}
// Create validates and creates a new filament spool.
func (s *FilamentService) Create(ctx context.Context, spool *models.FilamentSpool) (*models.FilamentSpool, error) {
if err := validateFilamentSpool(spool); err != nil {
return nil, err
}
return s.repo.Create(ctx, spool)
}
// Update applies partial updates to a filament spool. Field-level validation
// is handled separately (see ValidateUpdateFilamentRequest).
func (s *FilamentService) Update(ctx context.Context, id int, updates map[string]interface{}) (*models.FilamentSpool, error) {
return s.repo.Update(ctx, id, updates)
}
// SoftDelete marks a filament spool as deleted.
func (s *FilamentService) SoftDelete(ctx context.Context, id int) (bool, error) {
return s.repo.SoftDelete(ctx, id)
}
// PrinterService wraps PrinterRepository.
type PrinterService struct {
repo *repositories.PrinterRepository
}
// NewPrinterService creates a PrinterService backed by the given repository.
func NewPrinterService(repo *repositories.PrinterRepository) *PrinterService {
return &PrinterService{repo: repo}
}
// List returns all printers.
func (s *PrinterService) List(ctx context.Context) ([]models.Printer, error) {
return s.repo.GetAll(ctx)
}
// PrintJobService wraps PrintJobRepository.
type PrintJobService struct {
repo *repositories.PrintJobRepository
}
// NewPrintJobService creates a PrintJobService backed by the given repository.
func NewPrintJobService(repo *repositories.PrintJobRepository) *PrintJobService {
return &PrintJobService{repo: repo}
}
// List returns paginated print jobs filtered by the given criteria.
func (s *PrintJobService) List(ctx context.Context, filter repositories.PrintJobFilter) ([]models.PrintJob, int, error) {
return s.repo.GetAll(ctx, filter)
}
// GetByID returns a single print job by ID.
func (s *PrintJobService) GetByID(ctx context.Context, id int) (*models.PrintJob, error) {
return s.repo.GetByID(ctx, id)
}

View File

@@ -0,0 +1,74 @@
package services
import (
"errors"
"fmt"
"regexp"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/dtos"
"github.com/CubeCraft-Creations/Extrudex/backend/internal/models"
)
// colorHexPattern validates hex color strings like #FF0000 or #ff0000.
var colorHexPattern = regexp.MustCompile(`^#[0-9A-Fa-f]{6}$`)
// validateFilamentSpool performs validation on a FilamentSpool entity.
// Returns a descriptive error on failure.
func validateFilamentSpool(s *models.FilamentSpool) error {
if s.Name == "" {
return errors.New("name is required")
}
if s.MaterialBaseID <= 0 {
return errors.New("material_base_id is required")
}
if s.MaterialFinishID <= 0 {
return errors.New("material_finish_id is required")
}
if !colorHexPattern.MatchString(s.ColorHex) {
return fmt.Errorf("color_hex must be a valid hex color (e.g., #FF0000)")
}
if s.InitialGrams <= 0 {
return errors.New("initial_grams must be greater than 0")
}
if s.RemainingGrams < 0 {
return errors.New("remaining_grams must be >= 0")
}
return nil
}
// ValidateCreateFilamentRequest validates a creation DTO.
func ValidateCreateFilamentRequest(req dtos.CreateFilamentRequest) error {
if req.Name == "" {
return errors.New("name is required")
}
if req.MaterialBaseID <= 0 {
return errors.New("material_base_id is required")
}
if req.MaterialFinishID <= 0 {
return errors.New("material_finish_id is required")
}
if !colorHexPattern.MatchString(req.ColorHex) {
return fmt.Errorf("color_hex must be a valid hex color (e.g., #FF0000)")
}
if req.InitialGrams <= 0 {
return errors.New("initial_grams must be greater than 0")
}
if req.RemainingGrams < 0 {
return errors.New("remaining_grams must be >= 0")
}
return nil
}
// ValidateUpdateFilamentRequest validates partial update fields.
func ValidateUpdateFilamentRequest(req dtos.UpdateFilamentRequest) error {
if req.ColorHex != nil && !colorHexPattern.MatchString(*req.ColorHex) {
return fmt.Errorf("color_hex must be a valid hex color (e.g., #FF0000)")
}
if req.InitialGrams != nil && *req.InitialGrams <= 0 {
return errors.New("initial_grams must be greater than 0")
}
if req.RemainingGrams != nil && *req.RemainingGrams < 0 {
return errors.New("remaining_grams must be >= 0")
}
return nil
}
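
The colorHexPattern regex accepts only the full six-digit form, in either case; three-digit shorthand like `#FFF` is rejected. A quick standalone check:

```go
package main

import (
	"fmt"
	"regexp"
)

// Same pattern as the validator: '#' followed by exactly six hex digits,
// upper- or lower-case. No shorthand, no alpha channel.
var colorHexPattern = regexp.MustCompile(`^#[0-9A-Fa-f]{6}$`)

func main() {
	for _, s := range []string{"#FF0000", "#ff00aa", "FF0000", "#FFF", "#GG0000"} {
		fmt.Println(s, colorHexPattern.MatchString(s))
	}
}
```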

View File

@@ -0,0 +1,133 @@
package sse
import (
"log/slog"
"sync"
)
// client represents a single SSE subscriber — identified by its send channel.
type client struct {
ch chan string
}
// Broadcaster receives Events on its input channel and fans them out to every
// connected client. Subscribe adds a new client; Unsubscribe removes one.
// Start must be called before the broadcaster accepts events.
type Broadcaster struct {
input chan Event
subscribe chan client
unsubscribe chan client
clients map[chan string]struct{}
done chan struct{}
once sync.Once
}
// NewBroadcaster creates a Broadcaster. bufSize controls the buffer depth for
// the input channel as well as for each per-client outbound channel.
func NewBroadcaster(bufSize int) *Broadcaster {
if bufSize <= 0 {
bufSize = 64
}
return &Broadcaster{
input: make(chan Event, bufSize),
subscribe: make(chan client),
unsubscribe: make(chan client),
clients: make(map[chan string]struct{}),
done: make(chan struct{}),
}
}
// Publish pushes an event into the broadcaster. Safe for concurrent use.
func (b *Broadcaster) Publish(ev Event) {
select {
case b.input <- ev:
case <-b.done:
// Silently drop during shutdown.
}
}
// Start launches the broadcaster's fan-out loop in a goroutine.
// It must be called before Publish is used.
func (b *Broadcaster) Start() {
go b.loop()
}
// Stop terminates the fan-out loop and closes all client channels.
// It is safe to call multiple times.
func (b *Broadcaster) Stop() {
b.once.Do(func() {
close(b.done)
})
}
// Subscribe returns a new client channel that receives SSE-formatted strings.
func (b *Broadcaster) Subscribe() chan string {
c := client{ch: make(chan string, 64)}
select {
case b.subscribe <- c:
case <-b.done:
// Broadcaster already stopped — return a closed chan so the handler
// can bail out quickly.
ch := make(chan string)
close(ch)
return ch
}
return c.ch
}
// Unsubscribe removes a client channel and closes it.
func (b *Broadcaster) Unsubscribe(ch chan string) {
c := client{ch: ch}
select {
case b.unsubscribe <- c:
case <-b.done:
// Already shutting down — channels will be cleaned up by Stop.
}
}
// loop is the core fan-out goroutine.
func (b *Broadcaster) loop() {
for {
select {
case ev := <-b.input:
sse := ev.toSSE()
for ch := range b.clients {
// Non-blocking send — slow clients are dropped.
select {
case ch <- sse:
default:
slog.Warn("sse broadcaster: dropping event for slow client", "type", ev.Type)
}
}
case c := <-b.subscribe:
b.clients[c.ch] = struct{}{}
slog.Debug("sse broadcaster: client connected", "total_clients", len(b.clients))
case c := <-b.unsubscribe:
if _, ok := b.clients[c.ch]; ok {
delete(b.clients, c.ch)
close(c.ch)
slog.Debug("sse broadcaster: client disconnected", "total_clients", len(b.clients))
}
case <-b.done:
// Drain events already buffered in input before shutting down. input is
// never closed, so ranging over it would block forever; a select with a
// default case exits as soon as the buffer is empty.
drain:
for {
select {
case ev := <-b.input:
sse := ev.toSSE()
for ch := range b.clients {
select {
case ch <- sse:
default:
}
}
default:
break drain
}
}
// Close all remaining client channels.
for ch := range b.clients {
close(ch)
}
b.clients = nil
return
}
}
}
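
The slow-client policy in loop hinges on Go's non-blocking send: a `select` with a `default` case falls through when the buffered channel is full instead of stalling the fan-out goroutine. A minimal standalone sketch (`trySend` is illustrative, not part of the package):

```go
package main

import "fmt"

// trySend attempts a non-blocking send: if the channel's buffer is full,
// the message is dropped rather than stalling the sender, which is exactly
// the broadcaster's policy for slow SSE clients.
func trySend(ch chan string, msg string) bool {
	select {
	case ch <- msg:
		return true
	default:
		return false
	}
}

func main() {
	ch := make(chan string, 1)
	fmt.Println(trySend(ch, "first"))  // buffer has room
	fmt.Println(trySend(ch, "second")) // buffer full: dropped
}
```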

View File

@@ -0,0 +1,92 @@
// Package sse provides Server-Sent Events infrastructure for real-time updates.
// Includes event types, a central broadcaster, and an HTTP handler.
package sse
import (
"encoding/json"
"time"
)
// EventType identifies the category of an SSE event.
type EventType string
const (
EventPrinterStatus EventType = "printer.status"
EventJobStarted EventType = "job.started"
EventJobCompleted EventType = "job.completed"
EventFilamentLow EventType = "filament.low"
)
// Event is a JSON-serializable SSE event pushed through the broadcaster.
type Event struct {
Type EventType `json:"type"`
Payload json.RawMessage `json:"payload"`
Timestamp time.Time `json:"timestamp"`
}
// PrinterStatusPayload carries printer online/offline/printing state.
type PrinterStatusPayload struct {
PrinterID int `json:"printer_id"`
PrinterName string `json:"printer_name"`
Status string `json:"status"` // "online", "offline", "printing"
}
// JobStartedPayload carries initial print job info.
type JobStartedPayload struct {
JobID int `json:"job_id"`
JobName string `json:"job_name"`
PrinterID int `json:"printer_id"`
SpoolID *int `json:"spool_id,omitempty"`
}
// JobCompletedPayload carries final print job data including usage.
type JobCompletedPayload struct {
JobID int `json:"job_id"`
JobName string `json:"job_name"`
PrinterID int `json:"printer_id"`
DurationSeconds *int `json:"duration_seconds,omitempty"`
TotalGramsUsed *float64 `json:"total_grams_used,omitempty"`
TotalCostUSD *float64 `json:"total_cost_usd,omitempty"`
}
// FilamentLowPayload alerts that a spool is below its threshold.
type FilamentLowPayload struct {
SpoolID int `json:"spool_id"`
SpoolName string `json:"spool_name"`
RemainingGrams int `json:"remaining_grams"`
ThresholdGrams int `json:"threshold_grams"`
}
// NewEvent creates an Event with the current timestamp from a typed payload.
func NewEvent(eventType EventType, payload any) (Event, error) {
raw, err := json.Marshal(payload)
if err != nil {
return Event{}, err
}
return Event{
Type: eventType,
Payload: raw,
Timestamp: time.Now().UTC(),
}, nil
}
// MustEvent creates an Event and panics on marshal failure (for use with
// known-good payloads in tests and internal wiring).
func MustEvent(eventType EventType, payload any) Event {
ev, err := NewEvent(eventType, payload)
if err != nil {
panic("sse.MustEvent: failed to marshal payload: " + err.Error())
}
return ev
}
// toSSE formats this Event as a standard SSE message string ready to be
// written to a response writer. The format is:
//
// event: <type>
// data: <json>
//
func (e Event) toSSE() string {
data, _ := json.Marshal(e)
return "event: " + string(e.Type) + "\n" + "data: " + string(data) + "\n\n"
}

View File

@@ -0,0 +1,59 @@
package sse
import (
"net/http"
)
// Handler is the HTTP handler for the GET /api/events SSE stream.
// It registers a client with the broadcaster, streams events as they arrive,
// and unregisters on disconnect.
type Handler struct {
bc *Broadcaster
}
// NewHandler creates a Handler backed by the given Broadcaster.
func NewHandler(bc *Broadcaster) *Handler {
return &Handler{bc: bc}
}
// ServeHTTP implements the SSE streaming endpoint.
// http.Flusher support is required; if the ResponseWriter does not implement
// it, the handler responds with 501 Not Implemented.
func (h *Handler) ServeHTTP(w http.ResponseWriter, r *http.Request) {
flusher, ok := w.(http.Flusher)
if !ok {
http.Error(w, "streaming not supported", http.StatusNotImplemented)
return
}
// SSE-specific headers
w.Header().Set("Content-Type", "text/event-stream")
w.Header().Set("Cache-Control", "no-cache")
w.Header().Set("Connection", "keep-alive")
w.Header().Set("X-Accel-Buffering", "no") // Disable nginx buffering
// Write headers immediately
flusher.Flush()
// Subscribe to the broadcaster
ch := h.bc.Subscribe()
defer h.bc.Unsubscribe(ch)
// Use request context for cancellation when the client disconnects.
ctx := r.Context()
for {
select {
case <-ctx.Done():
return
case msg, ok := <-ch:
if !ok {
return
}
_, err := w.Write([]byte(msg))
if err != nil {
return
}
flusher.Flush()
}
}
}

View File

@@ -0,0 +1,19 @@
-- Migration: 000001_initial_schema (rollback)
-- Description: Drop all tables and indexes created in the initial schema migration
-- Author: Hex
-- Date: 2026-05-06
BEGIN;
DROP TABLE IF EXISTS usage_logs CASCADE;
DROP TABLE IF EXISTS print_jobs CASCADE;
DROP TABLE IF EXISTS filament_spools CASCADE;
DROP TABLE IF EXISTS printers CASCADE;
DROP TABLE IF EXISTS settings CASCADE;
DROP TABLE IF EXISTS material_modifiers CASCADE;
DROP TABLE IF EXISTS material_finishes CASCADE;
DROP TABLE IF EXISTS material_bases CASCADE;
DROP TABLE IF EXISTS job_statuses CASCADE;
DROP TABLE IF EXISTS printer_types CASCADE;
COMMIT;


@@ -0,0 +1,231 @@
-- Migration: 000001_initial_schema
-- Description: Create initial Extrudex schema — lookup tables, core entities, and settings
-- Author: Hex
-- Date: 2026-05-06
--
-- Design decisions:
-- - Lookup tables for material_base, material_finish, material_modifier (no free-text enums)
-- - Lookup tables for printer_type and job_status (extensible, no hard-coded enum values)
-- - FK ON DELETE: RESTRICT on critical parents (material_base, material_finish, printer),
-- SET NULL on optional parents (modifier, spool on print_jobs),
-- CASCADE for usage_logs when parent job is deleted
-- - Soft-delete (deleted_at) on spools and print_jobs for safety
-- - JSONB config column on settings for flexible app-wide configuration
-- - All identifiers snake_case per project convention
BEGIN;

-- ============================================================================
-- Lookup Tables
-- ============================================================================

-- Printer types (fdm, resin, etc.) — extensible, not a raw enum
CREATE TABLE printer_types (
    id SERIAL PRIMARY KEY,
    name VARCHAR(50) NOT NULL UNIQUE,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- Job statuses (pending, printing, paused, completed, failed, cancelled)
CREATE TABLE job_statuses (
    id SERIAL PRIMARY KEY,
    name VARCHAR(50) NOT NULL UNIQUE,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- Material base types (PLA, PETG, ABS, TPU, ASA, Nylon, PC)
CREATE TABLE material_bases (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100) NOT NULL UNIQUE,
    density_g_cm3 DECIMAL(5,3) NOT NULL,
    extrusion_temp_min INT,
    extrusion_temp_max INT,
    bed_temp_min INT,
    bed_temp_max INT,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- Material finishes (Basic, Silk, Matte, Glossy, Satin)
CREATE TABLE material_finishes (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100) NOT NULL UNIQUE,
    description TEXT,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- Material modifiers (Wood-Filled, Carbon Fiber, Glow-in-Dark, Marble)
CREATE TABLE material_modifiers (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100) NOT NULL UNIQUE,
    description TEXT,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- ============================================================================
-- Core Entity Tables
-- ============================================================================

-- 3D printers in the fleet
CREATE TABLE printers (
    id SERIAL PRIMARY KEY,
    name VARCHAR(255) NOT NULL,
    printer_type_id INT NOT NULL,
    manufacturer VARCHAR(255),
    model VARCHAR(255),
    moonraker_url VARCHAR(512),
    moonraker_api_key VARCHAR(512),
    mqtt_broker_host VARCHAR(255),
    mqtt_topic_prefix VARCHAR(255),
    mqtt_tls_enabled BOOLEAN NOT NULL DEFAULT FALSE,
    is_active BOOLEAN NOT NULL DEFAULT TRUE,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    CONSTRAINT fk_printers_printer_type
        FOREIGN KEY (printer_type_id) REFERENCES printer_types(id)
        ON DELETE RESTRICT
);

-- Filament spools — the core inventory item
CREATE TABLE filament_spools (
    id SERIAL PRIMARY KEY,
    name VARCHAR(255) NOT NULL,
    material_base_id INT NOT NULL,
    material_finish_id INT NOT NULL DEFAULT 1, -- "Basic" (seed data populates this first)
    material_modifier_id INT,
    color_hex VARCHAR(7) NOT NULL CHECK (color_hex ~ '^#[0-9A-Fa-f]{6}$'),
    brand VARCHAR(255),
    diameter_mm DECIMAL(4,2) NOT NULL DEFAULT 1.75,
    initial_grams INT NOT NULL CHECK (initial_grams > 0),
    remaining_grams INT NOT NULL CHECK (remaining_grams >= 0),
    spool_weight_grams INT, -- measured empty-spool weight (tare), nullable
    cost_usd DECIMAL(10,2),
    low_stock_threshold_grams INT NOT NULL DEFAULT 50,
    notes TEXT,
    barcode VARCHAR(255) UNIQUE,
    deleted_at TIMESTAMPTZ,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    CONSTRAINT fk_spools_material_base
        FOREIGN KEY (material_base_id) REFERENCES material_bases(id)
        ON DELETE RESTRICT,
    CONSTRAINT fk_spools_material_finish
        FOREIGN KEY (material_finish_id) REFERENCES material_finishes(id)
        ON DELETE RESTRICT,
    CONSTRAINT fk_spools_material_modifier
        FOREIGN KEY (material_modifier_id) REFERENCES material_modifiers(id)
        ON DELETE SET NULL
);

-- Print jobs — each job is one print on one printer
CREATE TABLE print_jobs (
    id SERIAL PRIMARY KEY,
    printer_id INT NOT NULL,
    filament_spool_id INT, -- nullable: a job may use multiple spools (captured in usage_logs)
    job_name VARCHAR(255) NOT NULL,
    file_name VARCHAR(512),
    job_status_id INT NOT NULL DEFAULT 1, -- "pending"
    started_at TIMESTAMPTZ,
    completed_at TIMESTAMPTZ,
    duration_seconds INT,
    estimated_duration_seconds INT,
    total_mm_extruded DECIMAL(12,2),
    total_grams_used DECIMAL(10,2),
    total_cost_usd DECIMAL(10,4),
    notes TEXT,
    deleted_at TIMESTAMPTZ,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    CONSTRAINT fk_print_jobs_printer
        FOREIGN KEY (printer_id) REFERENCES printers(id)
        ON DELETE RESTRICT,
    CONSTRAINT fk_print_jobs_spool
        FOREIGN KEY (filament_spool_id) REFERENCES filament_spools(id)
        ON DELETE SET NULL,
    CONSTRAINT fk_print_jobs_status
        FOREIGN KEY (job_status_id) REFERENCES job_statuses(id)
        ON DELETE RESTRICT
);

-- Usage logs — granular tracking of filament consumed per job, per spool
CREATE TABLE usage_logs (
    id SERIAL PRIMARY KEY,
    print_job_id INT NOT NULL,
    filament_spool_id INT NOT NULL,
    mm_extruded DECIMAL(12,2) NOT NULL CHECK (mm_extruded > 0),
    grams_used DECIMAL(10,2) NOT NULL CHECK (grams_used > 0),
    cost_usd DECIMAL(10,4),
    logged_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    CONSTRAINT fk_usage_logs_print_job
        FOREIGN KEY (print_job_id) REFERENCES print_jobs(id)
        ON DELETE CASCADE,
    CONSTRAINT fk_usage_logs_spool
        FOREIGN KEY (filament_spool_id) REFERENCES filament_spools(id)
        ON DELETE RESTRICT
);

-- ============================================================================
-- Application Settings
-- ============================================================================

CREATE TABLE settings (
    id SERIAL PRIMARY KEY,
    key VARCHAR(255) NOT NULL UNIQUE,
    value JSONB NOT NULL,
    description TEXT,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- ============================================================================
-- Indexes
-- ============================================================================

-- Filament spools — query patterns: lookup by material, low-stock scans, barcode scans
CREATE INDEX ix_spools_material_base_id ON filament_spools(material_base_id);
CREATE INDEX ix_spools_material_finish_id ON filament_spools(material_finish_id);
CREATE INDEX ix_spools_material_modifier_id ON filament_spools(material_modifier_id);
CREATE INDEX ix_spools_remaining_grams ON filament_spools(remaining_grams)
    WHERE deleted_at IS NULL; -- partial index: only active spools for low-stock queries
CREATE INDEX ix_spools_barcode ON filament_spools(barcode)
    WHERE barcode IS NOT NULL AND deleted_at IS NULL;
CREATE INDEX ix_spools_deleted_at ON filament_spools(deleted_at)
    WHERE deleted_at IS NOT NULL; -- small index for soft-delete filtering

-- Printers
CREATE INDEX ix_printers_printer_type_id ON printers(printer_type_id);
CREATE INDEX ix_printers_is_active ON printers(is_active)
    WHERE is_active = TRUE; -- partial index for fleet dashboard queries

-- Print jobs — query by printer, status, date range, and soft-delete filter
CREATE INDEX ix_print_jobs_printer_id ON print_jobs(printer_id);
CREATE INDEX ix_print_jobs_spool_id ON print_jobs(filament_spool_id)
    WHERE filament_spool_id IS NOT NULL;
CREATE INDEX ix_print_jobs_status_id ON print_jobs(job_status_id);
CREATE INDEX ix_print_jobs_created_at ON print_jobs(created_at DESC);
CREATE INDEX ix_print_jobs_deleted_at ON print_jobs(deleted_at)
    WHERE deleted_at IS NOT NULL;

-- Usage logs — always queried by job or spool
CREATE INDEX ix_usage_logs_print_job_id ON usage_logs(print_job_id);
CREATE INDEX ix_usage_logs_spool_id ON usage_logs(filament_spool_id);
CREATE INDEX ix_usage_logs_logged_at ON usage_logs(logged_at DESC);

-- Settings — key lookups
CREATE INDEX ix_settings_key ON settings(key);

COMMIT;
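The color_hex CHECK constraint above can be mirrored in application code so malformed values are rejected before a round-trip to Postgres. A small sketch — doing this client-side is an assumption on my part; the migration only enforces it in the database:

```go
package main

import (
	"fmt"
	"regexp"
)

// colorHexRe mirrors the CHECK constraint on filament_spools.color_hex:
// a '#' followed by exactly six hex digits, case-insensitive.
var colorHexRe = regexp.MustCompile(`^#[0-9A-Fa-f]{6}$`)

// colorHexOK reports whether s would satisfy the database constraint.
func colorHexOK(s string) bool {
	return colorHexRe.MatchString(s)
}

func main() {
	for _, s := range []string{"#1A2b3C", "1A2b3C", "#12345"} {
		fmt.Printf("%-8s %v\n", s, colorHexOK(s))
	}
}
```

Keeping the regexp identical to the DDL means a rejected insert and a failed form validation always agree on what counts as a valid color.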


@@ -0,0 +1,15 @@
-- Migration: 000002_seed_data (rollback)
-- Description: Remove seed data inserted in 000002
-- Author: Hex
-- Date: 2026-05-06
BEGIN;
DELETE FROM settings WHERE key IN ('default_low_stock_threshold_grams', 'default_diameter_mm', 'filament_cross_section_area_mm2');
DELETE FROM material_modifiers WHERE id IN (1, 2, 3, 4);
DELETE FROM material_finishes WHERE id IN (1, 2, 3, 4, 5);
DELETE FROM material_bases WHERE id IN (1, 2, 3, 4, 5, 6, 7);
DELETE FROM job_statuses WHERE id IN (1, 2, 3, 4, 5, 6);
DELETE FROM printer_types WHERE id IN (1, 2);
COMMIT;


@@ -0,0 +1,95 @@
-- Seed Data: Extrudex common reference data
-- Author: Hex
-- Date: 2026-05-06
--
-- IMPORTANT: IDs are explicitly assigned to satisfy the DEFAULT constraints:
-- - filament_spools.material_finish_id DEFAULT 1 ("Basic")
-- - print_jobs.job_status_id DEFAULT 1 ("pending")
--
-- Density values sourced from common manufacturer specifications.
-- Temperature ranges are conservative/typical; users can override per-spool.
BEGIN;
-- ============================================================================
-- Printer Types
-- ============================================================================
INSERT INTO printer_types (id, name) VALUES
(1, 'fdm'),
(2, 'resin')
ON CONFLICT (id) DO NOTHING;
-- Reset the sequence so future inserts start after our explicit IDs
SELECT setval('printer_types_id_seq', GREATEST(2, (SELECT MAX(id) FROM printer_types)));
-- ============================================================================
-- Job Statuses
-- ============================================================================
INSERT INTO job_statuses (id, name) VALUES
(1, 'pending'),
(2, 'printing'),
(3, 'paused'),
(4, 'completed'),
(5, 'failed'),
(6, 'cancelled')
ON CONFLICT (id) DO NOTHING;
SELECT setval('job_statuses_id_seq', GREATEST(6, (SELECT MAX(id) FROM job_statuses)));
-- ============================================================================
-- Material Bases (common filament types)
-- ============================================================================
INSERT INTO material_bases (id, name, density_g_cm3, extrusion_temp_min, extrusion_temp_max, bed_temp_min, bed_temp_max) VALUES
(1, 'PLA', 1.24, 190, 220, 0, 60),
(2, 'PETG', 1.27, 230, 250, 70, 90),
(3, 'ABS', 1.04, 230, 260, 90, 110),
(4, 'TPU', 1.21, 220, 250, 0, 60),
(5, 'ASA', 1.07, 240, 260, 90, 110),
(6, 'Nylon', 1.14, 240, 280, 70, 100),
(7, 'PC', 1.20, 260, 310, 90, 120)
ON CONFLICT (id) DO NOTHING;
SELECT setval('material_bases_id_seq', GREATEST(7, (SELECT MAX(id) FROM material_bases)));
-- ============================================================================
-- Material Finishes
-- ============================================================================
-- ID 1 = "Basic" is the default for new spools (DEFAULT 1 constraint)
INSERT INTO material_finishes (id, name, description) VALUES
(1, 'Basic', 'Standard solid-color filament with no special finish'),
(2, 'Silk', 'Glossy silk-like sheen, often used for decorative prints'),
(3, 'Matte', 'Flat non-reflective surface finish'),
(4, 'Glossy', 'High-shine reflective surface'),
(5, 'Satin', 'Semi-gloss between matte and glossy')
ON CONFLICT (id) DO NOTHING;
SELECT setval('material_finishes_id_seq', GREATEST(5, (SELECT MAX(id) FROM material_finishes)));
-- ============================================================================
-- Material Modifiers
-- ============================================================================
INSERT INTO material_modifiers (id, name, description) VALUES
(1, 'Wood-Filled', 'Contains wood fibers for natural wood-like appearance and texture'),
(2, 'Carbon Fiber', 'Reinforced with carbon fibers for increased stiffness and strength'),
(3, 'Glow-in-Dark', 'Phosphorescent additive that glows after exposure to light'),
(4, 'Marble', 'Contains specks for a stone-like marble appearance')
ON CONFLICT (id) DO NOTHING;
SELECT setval('material_modifiers_id_seq', GREATEST(4, (SELECT MAX(id) FROM material_modifiers)));
-- ============================================================================
-- Default Application Settings
-- ============================================================================
INSERT INTO settings (key, value, description) VALUES
('default_low_stock_threshold_grams', '50', 'Default grams threshold for low-stock alerts on new spools'),
('default_diameter_mm', '1.75', 'Default filament diameter for new spools (1.75mm is the modern standard)'),
('filament_cross_section_area_mm2', '2.405', 'Cross-sectional area for 1.75mm filament: π × (1.75/2)²')
ON CONFLICT (key) DO NOTHING;
COMMIT;
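The seeded `filament_cross_section_area_mm2` value and the per-material `density_g_cm3` column are enough to convert extruded millimeters into grams. A sketch of that conversion — the backend's actual calculation is not shown in this diff:

```go
package main

import "fmt"

// gramsUsed converts millimeters of filament extruded into grams:
// volume (mm^3) = mm * cross-section area (mm^2), then divide by 1000
// to get cm^3 and multiply by density (g/cm^3).
func gramsUsed(mmExtruded, areaMM2, densityGCm3 float64) float64 {
	volumeCm3 := mmExtruded * areaMM2 / 1000 // 1000 mm^3 per cm^3
	return volumeCm3 * densityGCm3
}

func main() {
	// 1 meter of PLA: area 2.405 mm^2 (seeded setting), density 1.24 g/cm^3
	// (material_bases row 1) -> about 2.98 g.
	fmt.Printf("%.2f g\n", gramsUsed(1000, 2.405, 1.24))
}
```

This is also a sanity check on the seed values: a full 1 kg spool of PLA at 2.98 g/m works out to roughly 335 m of filament, which matches manufacturer spec sheets.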

deploy.sh Executable file

@@ -0,0 +1,34 @@
#!/bin/bash
set -e

echo "🔧 Deploying Extrudex Docker runtime..."

# Check if Docker Compose is available
if ! command -v docker-compose &> /dev/null && ! docker compose version &> /dev/null; then
    echo "❌ Docker Compose is not installed"
    exit 1
fi

# Prefer the standalone docker-compose binary when present, otherwise the plugin
COMPOSE_CMD="docker compose"
if command -v docker-compose &> /dev/null; then
    COMPOSE_CMD="docker-compose"
fi

echo "📦 Building and starting services..."
$COMPOSE_CMD -f docker-compose.dev.yml up -d --build

echo "⏳ Waiting for services to become healthy..."
sleep 15

echo "✅ Deployment complete!"
echo ""
echo "Services running:"
echo "  • PostgreSQL:   localhost:5433"
echo "  • Extrudex API: http://localhost:5080"
echo "  • Extrudex Web: http://localhost:5081"
echo ""
echo "To view logs:"
echo "  $COMPOSE_CMD -f docker-compose.dev.yml logs -f"
echo ""
echo "To stop:"
echo "  $COMPOSE_CMD -f docker-compose.dev.yml down"

docker-compose.dev.yml Normal file

@@ -0,0 +1,70 @@
services:
  extrudex-db:
    image: postgres:16-alpine
    container_name: extrudex-db
    environment:
      POSTGRES_USER: extrudex
      POSTGRES_PASSWORD: changeme
      POSTGRES_DB: extrudex
    ports:
      - "5433:5432"
    volumes:
      - extrudex-db-data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U extrudex"]
      interval: 10s
      timeout: 5s
      retries: 5
      start_period: 10s
    restart: unless-stopped
    networks:
      - extrudex-network

  extrudex-api:
    build:
      context: ./backend
      dockerfile: Dockerfile
    container_name: extrudex-api
    ports:
      - "5080:8080"
    environment:
      - ASPNETCORE_ENVIRONMENT=Development
      - ASPNETCORE_URLS=http://+:8080
      - EXTRUDEX_DB_HOST=extrudex-db
      - EXTRUDEX_DB_PORT=5432
      - EXTRUDEX_DB_NAME=extrudex
      - EXTRUDEX_DB_USER=extrudex
      - EXTRUDEX_DB_PASSWORD=changeme
    depends_on:
      extrudex-db:
        condition: service_healthy
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8080/health"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s
    networks:
      - extrudex-network

  extrudex-web:
    build:
      context: ./frontend
      dockerfile: Dockerfile
    container_name: extrudex-web
    ports:
      - "5081:80"
    depends_on:
      extrudex-api:
        condition: service_healthy
    restart: unless-stopped
    networks:
      - extrudex-network

volumes:
  extrudex-db-data:

networks:
  extrudex-network:
    driver: bridge

frontend/Dockerfile Normal file

@@ -0,0 +1,14 @@
# Build stage
FROM node:22-alpine AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build
# Serve stage
FROM nginx:alpine
COPY --from=builder /app/dist /usr/share/nginx/html
COPY nginx.conf /etc/nginx/conf.d/default.conf
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]

frontend/eslint.config.js Normal file

@@ -0,0 +1,28 @@
import js from '@eslint/js'
import globals from 'globals'
import reactHooks from 'eslint-plugin-react-hooks'
import reactRefresh from 'eslint-plugin-react-refresh'
import tseslint from 'typescript-eslint'

export default tseslint.config(
  { ignores: ['dist'] },
  {
    extends: [js.configs.recommended, ...tseslint.configs.recommended],
    files: ['**/*.{ts,tsx}'],
    languageOptions: {
      ecmaVersion: 2020,
      globals: globals.browser,
    },
    plugins: {
      'react-hooks': reactHooks,
      'react-refresh': reactRefresh,
    },
    rules: {
      ...reactHooks.configs.recommended.rules,
      'react-refresh/only-export-components': [
        'warn',
        { allowConstantExport: true },
      ],
    },
  },
)

Some files were not shown because too many files have changed in this diff.