33 Commits

Author SHA1 Message Date
gamer147
75e96cbee5 [FA-17] Update auth
All checks were successful
CI / build-backend (pull_request) Successful in 1m13s
CI / build-frontend (pull_request) Successful in 34s
2025-11-27 23:23:03 -05:00
Claude
9c82d648cd fix: address authentication system issues
- Fix GraphQL authorization attributes to use string[] instead of string for roles
- Remove admin role requirement from ImportNovel endpoint
- Add comprehensive OIDC configuration validation with specific error messages
- Validate Authority, ClientId, and Audience are properly configured
- Ensure HTTPS requirement except for localhost development

Co-authored-by: conco <conco@users.noreply.local>
2025-11-27 16:20:09 +00:00
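The "HTTPS requirement except for localhost development" rule described in the commit above can be sketched as a standalone predicate. This is an illustrative shell version only; the real check lives in the project's .NET OIDC configuration validation, and the function name here is hypothetical.

```shell
# Illustrative only: mirrors the described "HTTPS except localhost" rule;
# the actual validation is implemented in the .NET services.
validate_authority() {
  case "$1" in
    https://*) echo "valid" ;;                            # HTTPS always allowed
    http://localhost*|http://127.0.0.1*) echo "valid" ;;  # development exception
    *) echo "invalid" ;;                                  # plain HTTP rejected
  esac
}

validate_authority "https://auth.example.com"   # -> valid
validate_authority "http://localhost:8080"      # -> valid
validate_authority "http://auth.example.com"    # -> invalid
```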
Claude
78612ea29d feat: implement authentication system for API Gateway and FileService
Some checks failed
CI / build-backend (pull_request) Failing after 1m12s
CI / build-frontend (pull_request) Successful in 28s
- Add JWT Bearer token validation to API Gateway with restricted CORS
- Add cookie-based JWT validation to FileService for browser image requests
- Create shared authentication infrastructure in FictionArchive.Service.Shared
- Update frontend to set fa_session cookie after OIDC login
- Add [Authorize] attributes to GraphQL mutations with role-based restrictions
- Configure OIDC settings for both services in docker-compose

Implements FA-17: Authentication for microservices architecture
2025-11-27 14:05:54 +00:00
4412a1f658 Merge pull request 'feature/FA-11_CICD' (#33) from feature/FA-11_CICD into master
All checks were successful
CI / build-backend (push) Successful in 54s
CI / build-frontend (push) Successful in 26s
Reviewed-on: #33
2025-11-26 23:39:45 +00:00
12e3c5dfdd Merge branch 'master' into feature/FA-11_CICD
All checks were successful
CI / build-backend (pull_request) Successful in 57s
CI / build-frontend (pull_request) Successful in 26s
2025-11-26 23:39:35 +00:00
gamer147
b71d9031e1 [FA-11] Finished for real
All checks were successful
CI / build-backend (pull_request) Successful in 1m0s
CI / build-frontend (pull_request) Successful in 26s
2025-11-26 18:26:30 -05:00
gamer147
09ebdb1b2a [FA-11] Cleanup
All checks were successful
CI / build-backend (pull_request) Successful in 1m13s
CI / build-frontend (pull_request) Successful in 26s
2025-11-26 16:08:40 -05:00
43d5ada7fb Update .gitea/workflows/claude_assistant.yml 2025-11-26 18:58:49 +00:00
gamer147
4635ed1b4e [FA-11] Finalized
All checks were successful
CI / build-backend (pull_request) Successful in 55s
CI / build-frontend (pull_request) Successful in 26s
2025-11-26 13:36:22 -05:00
gamer147
920fd00910 [FA-11] Dumb
All checks were successful
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (pull_request) Successful in 55s
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (pull_request) Successful in 39s
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (pull_request) Successful in 47s
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (pull_request) Successful in 40s
CI / build-backend (pull_request) Successful in 50s
CI / build-frontend (pull_request) Successful in 26s
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (pull_request) Successful in 1m50s
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (pull_request) Successful in 1m47s
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (pull_request) Successful in 1m43s
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (pull_request) Successful in 1m35s
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (pull_request) Successful in 1m40s
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (pull_request) Successful in 1m29s
Release / build-frontend (pull_request) Successful in 50s
Build Gateway / build-gateway (pull_request) Successful in 2m59s
2025-11-26 13:11:22 -05:00
gamer147
0d9f788678 [FA-11] Hopefully last
Some checks failed
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (pull_request) Successful in 47s
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (pull_request) Successful in 41s
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (pull_request) Successful in 43s
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (pull_request) Successful in 40s
CI / build-backend (pull_request) Successful in 50s
CI / build-frontend (pull_request) Successful in 27s
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (pull_request) Has been cancelled
Release / build-frontend (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (pull_request) Has been cancelled
Build Gateway / build-gateway (pull_request) Failing after 27s
2025-11-26 13:03:28 -05:00
gamer147
0938c16a76 [FA-11] Dumb & cleanup
Some checks failed
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (pull_request) Failing after 44s
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (pull_request) Failing after 42s
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (pull_request) Failing after 42s
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (pull_request) Failing after 42s
Build Gateway / build-gateway (pull_request) Has been skipped
CI / build-backend (pull_request) Successful in 1m11s
CI / build-frontend (pull_request) Successful in 26s
Release / build-and-push (map[dockerfile:FictionArchive.API/Dockerfile name:api]) (pull_request) Successful in 2m21s
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (pull_request) Successful in 1m51s
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (pull_request) Has been cancelled
Release / build-frontend (pull_request) Has been cancelled
2025-11-26 12:49:07 -05:00
gamer147
f25cbc1a04 [FA-11] Dumb
Some checks failed
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (pull_request) Failing after 42s
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (pull_request) Failing after 45s
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (pull_request) Failing after 42s
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (pull_request) Failing after 40s
Build Gateway / build-gateway (pull_request) Has been skipped
CI / build-frontend (pull_request) Has been cancelled
CI / build-backend (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.API/Dockerfile name:api]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (pull_request) Has been cancelled
Release / build-frontend (pull_request) Has been cancelled
2025-11-26 12:44:51 -05:00
gamer147
078eaf5237 [FA-11] Dumb
Some checks failed
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (pull_request) Failing after 41s
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (pull_request) Failing after 39s
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (pull_request) Has been cancelled
Build Gateway / build-gateway (pull_request) Has been cancelled
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (pull_request) Has been cancelled
CI / build-frontend (pull_request) Has been cancelled
CI / build-backend (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (pull_request) Has been cancelled
Release / build-frontend (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.API/Dockerfile name:api]) (pull_request) Has been cancelled
2025-11-26 12:42:35 -05:00
gamer147
b9115d78a9 [FA-11] I'm getting sick of fusion but I don't see better alternatives
Some checks failed
Release / build-and-push (map[dockerfile:FictionArchive.API/Dockerfile name:api]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (pull_request) Has been cancelled
Release / build-frontend (pull_request) Has been cancelled
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (pull_request) Failing after 51s
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (pull_request) Has been cancelled
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (pull_request) Has been cancelled
Build Gateway / build-gateway (pull_request) Has been cancelled
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (pull_request) Has been cancelled
CI / build-frontend (pull_request) Has been cancelled
CI / build-backend (pull_request) Has been cancelled
2025-11-26 12:40:22 -05:00
gamer147
7e94f06853 [FA-11] Remove FileService graphQL build
Some checks failed
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (pull_request) Failing after 42s
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (pull_request) Failing after 40s
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (pull_request) Failing after 42s
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (pull_request) Failing after 40s
Build Gateway / build-gateway (pull_request) Has been skipped
CI / build-backend (pull_request) Successful in 55s
CI / build-frontend (pull_request) Successful in 27s
Release / build-and-push (map[dockerfile:FictionArchive.API/Dockerfile name:api]) (pull_request) Successful in 2m22s
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (pull_request) Successful in 1m46s
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (pull_request) Failing after 21s
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (pull_request) Successful in 1m39s
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (pull_request) Successful in 1m31s
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (pull_request) Successful in 1m34s
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (pull_request) Successful in 1m27s
Release / build-frontend (pull_request) Successful in 51s
2025-11-26 11:57:18 -05:00
gamer147
50263109ab [FA-11] More pipeline fixes
Some checks failed
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (pull_request) Has been cancelled
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (pull_request) Has been cancelled
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (pull_request) Has been cancelled
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (pull_request) Has been cancelled
Build Gateway / build-gateway (pull_request) Has been cancelled
Build Gateway / build-subgraphs (map[name:file-service project:FictionArchive.Service.FileService subgraph:File]) (pull_request) Has been cancelled
CI / build-frontend (pull_request) Has been cancelled
CI / build-backend (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (pull_request) Has been cancelled
Release / build-frontend (pull_request) Has been cancelled
Release / build-and-push (map[dockerfile:FictionArchive.API/Dockerfile name:api]) (pull_request) Has been cancelled
2025-11-26 11:54:57 -05:00
gamer147
6ebfe81ae3 [FA-11] Test pipelines
Some checks failed
Build Gateway / build-gateway (pull_request) Has been cancelled
Build Subgraphs / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (pull_request) Has been cancelled
Build Subgraphs / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (pull_request) Has been cancelled
Build Subgraphs / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (pull_request) Has been cancelled
Build Subgraphs / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (pull_request) Has been cancelled
Build Subgraphs / trigger-gateway (pull_request) Has been cancelled
Build Subgraphs / build-subgraphs (map[name:file-service project:FictionArchive.Service.FileService subgraph:File]) (pull_request) Has been cancelled
CI / build-backend (pull_request) Successful in 58s
CI / build-frontend (pull_request) Successful in 27s
Release / build-and-push (map[dockerfile:FictionArchive.API/Dockerfile name:api]) (pull_request) Failing after 1m37s
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (pull_request) Failing after 19s
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (pull_request) Failing after 20s
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (pull_request) Failing after 20s
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (pull_request) Failing after 19s
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (pull_request) Failing after 19s
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (pull_request) Failing after 19s
Release / build-frontend (pull_request) Failing after 19s
2025-11-26 11:37:05 -05:00
gamer147
80aac63f7d Merge remote-tracking branch 'origin/feature/FA-11_CICD' into feature/FA-11_CICD
All checks were successful
CI / build-backend (pull_request) Successful in 1m29s
CI / build-frontend (pull_request) Successful in 1m7s
2025-11-26 11:25:57 -05:00
gamer147
adc99c7000 [FA-11] Updated manual dispatch 2025-11-26 11:25:52 -05:00
87075be61e Update .gitea/workflows/claude_assistant.yml 2025-11-26 16:17:03 +00:00
259dc08aea Update .gitea/workflows/claude_assistant.yml 2025-11-26 16:14:48 +00:00
2203d2ee54 Update .gitea/workflows/claude_assistant.yml 2025-11-26 15:54:49 +00:00
30cc89242d Merge branch 'master' into feature/FA-11_CICD
All checks were successful
CI / build-backend (pull_request) Successful in 1m1s
CI / build-frontend (pull_request) Successful in 25s
2025-11-26 15:37:47 +00:00
84294455f9 Update .gitea/workflows/claude_assistant.yml 2025-11-26 15:27:14 +00:00
be62af98d3 Update .gitea/workflows/claude_assistant.yml 2025-11-26 15:19:25 +00:00
gamer147
15a8185621 [FA-11] Fix react build issues
All checks were successful
CI / build-backend (pull_request) Successful in 1m7s
CI / build-frontend (pull_request) Successful in 26s
2025-11-26 08:48:00 -05:00
gamer147
0180a58084 [FA-11] Hopefully resolves build issues, although I don't know why the build_gateway was necessarily failing in build.yml and trying to access Debug bins
Some checks failed
CI / build-backend (pull_request) Successful in 56s
CI / build-frontend (pull_request) Failing after 23s
2025-11-26 07:26:57 -05:00
gamer147
573f3fc7b0 [FA-11] That causes an error so fingers crossed this time
Some checks failed
CI / build-backend (pull_request) Failing after 52s
CI / build-frontend (pull_request) Failing after 21s
2025-11-26 07:11:40 -05:00
gamer147
cdc2176e35 [FA-11] Try and disable the caching again, forgot a step like an idiot
Some checks failed
CI / build-backend (pull_request) Failing after 1m24s
CI / build-frontend (pull_request) Failing after 20s
2025-11-26 07:08:32 -05:00
gamer147
e9eaf1569b [FA-11] Disable Node caching altogether and let backend rebuild if needed
Some checks failed
CI / build-backend (pull_request) Failing after 52s
CI / build-frontend (pull_request) Failing after 4m52s
2025-11-26 00:49:27 -05:00
gamer147
ba99642e97 [FA-11] Fix build errors, try to fix cache miss on node build
Some checks failed
CI / build-backend (pull_request) Failing after 1m11s
CI / build-frontend (pull_request) Has been cancelled
2025-11-26 00:40:07 -05:00
aff1396c6a Update .gitea/workflows/claude_assistant.yml 2025-11-26 05:03:53 +00:00
53 changed files with 2195 additions and 1444 deletions

File: Build Gateway workflow

@@ -3,18 +3,31 @@ name: Build Gateway
 on:
   workflow_dispatch:
   push:
-    branches:
-      - master
-    paths:
-      - 'FictionArchive.API/**'
+    tags:
+      - 'v*.*.*'
 env:
   REGISTRY: ${{ gitea.server_url }}
   IMAGE_NAME: ${{ gitea.repository_owner }}/fictionarchive-api
 jobs:
-  build-gateway:
+  build-subgraphs:
     runs-on: ubuntu-latest
+    strategy:
+      matrix:
+        service:
+          - name: novel-service
+            project: FictionArchive.Service.NovelService
+            subgraph: Novel
+          - name: translation-service
+            project: FictionArchive.Service.TranslationService
+            subgraph: Translation
+          - name: scheduler-service
+            project: FictionArchive.Service.SchedulerService
+            subgraph: Scheduler
+          - name: user-service
+            project: FictionArchive.Service.UserService
+            subgraph: User
     steps:
       - name: Checkout
         uses: actions/checkout@v4
@@ -27,44 +40,75 @@ jobs:
       - name: Install Fusion CLI
         run: dotnet tool install -g HotChocolate.Fusion.CommandLine
+      - name: Add .NET tools to PATH
+        run: echo "$HOME/.dotnet/tools" >> $GITHUB_PATH
+      - name: Restore dependencies
+        run: dotnet restore ${{ matrix.service.project }}/${{ matrix.service.project }}.csproj
+      - name: Build
+        run: dotnet build ${{ matrix.service.project }}/${{ matrix.service.project }}.csproj -c Release --no-restore
+      - name: Export schema
+        run: |
+          dotnet run -c Release --no-launch-profile \
+            --project ${{ matrix.service.project }}/${{ matrix.service.project }}.csproj \
+            -- schema export --output schema.graphql
+      - name: Pack subgraph
+        run: fusion subgraph pack -w ${{ matrix.service.project }}
+      - name: Upload subgraph package
+        uses: christopherhx/gitea-upload-artifact@v4
+        with:
+          name: ${{ matrix.service.name }}-subgraph
+          path: ${{ matrix.service.project }}/*.fsp
+          retention-days: 30
+  build-gateway:
+    runs-on: ubuntu-latest
+    needs: build-subgraphs
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v4
+      - name: Setup .NET
+        uses: actions/setup-dotnet@v4
+        with:
+          dotnet-version: '8.0.x'
+      - name: Install Fusion CLI
+        run: dotnet tool install -g HotChocolate.Fusion.CommandLine
+      - name: Add .NET tools to PATH
+        run: echo "$HOME/.dotnet/tools" >> $GITHUB_PATH
       - name: Create subgraphs directory
         run: mkdir -p subgraphs
-      # Download all subgraph packages from latest successful builds
       - name: Download Novel Service subgraph
-        uses: actions/download-artifact@v4
+        uses: christopherhx/gitea-download-artifact@v4
         with:
           name: novel-service-subgraph
           path: subgraphs/novel
-        continue-on-error: true
       - name: Download Translation Service subgraph
-        uses: actions/download-artifact@v4
+        uses: christopherhx/gitea-download-artifact@v4
         with:
           name: translation-service-subgraph
           path: subgraphs/translation
-        continue-on-error: true
       - name: Download Scheduler Service subgraph
-        uses: actions/download-artifact@v4
+        uses: christopherhx/gitea-download-artifact@v4
        with:
           name: scheduler-service-subgraph
           path: subgraphs/scheduler
-        continue-on-error: true
       - name: Download User Service subgraph
-        uses: actions/download-artifact@v4
+        uses: christopherhx/gitea-download-artifact@v4
         with:
           name: user-service-subgraph
           path: subgraphs/user
-        continue-on-error: true
-      - name: Download File Service subgraph
-        uses: actions/download-artifact@v4
-        with:
-          name: file-service-subgraph
-          path: subgraphs/file
-        continue-on-error: true
       - name: Configure subgraph URLs for Docker
         run: |
@@ -95,13 +139,13 @@ jobs:
       - name: Build gateway
         run: dotnet build FictionArchive.API/FictionArchive.API.csproj -c Release --no-restore -p:SkipFusionBuild=true
-      - name: Run tests
-        run: dotnet test FictionArchive.sln -c Release --no-build --verbosity normal
-        continue-on-error: true
       - name: Set up Docker Buildx
         uses: docker/setup-buildx-action@v3
+      - name: Extract registry hostname
+        id: registry
+        run: echo "HOST=$(echo '${{ gitea.server_url }}' | sed 's|https\?://||')" >> $GITHUB_OUTPUT
       - name: Log in to Gitea Container Registry
         uses: docker/login-action@v3
         with:
@@ -116,7 +160,7 @@ jobs:
           file: FictionArchive.API/Dockerfile
           push: true
           tags: |
-            ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:latest
-            ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ gitea.sha }}
+            ${{ steps.registry.outputs.HOST }}/${{ env.IMAGE_NAME }}:latest
+            ${{ steps.registry.outputs.HOST }}/${{ env.IMAGE_NAME }}:${{ gitea.sha }}
           cache-from: type=gha
           cache-to: type=gha,mode=max
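The `Extract registry hostname` step added in this diff is a one-line sed substitution: `docker/login-action` and image tags need a bare hostname, while `gitea.server_url` includes the scheme. A minimal standalone sketch, with a stand-in URL since the real value only exists inside the workflow run:

```shell
# Stand-in for ${{ gitea.server_url }}; the real value comes from the runner.
SERVER_URL="https://git.example.com"

# Strip a leading "http://" or "https://" to get a bare registry hostname.
HOST=$(echo "$SERVER_URL" | sed 's|https\?://||')
echo "$HOST"   # -> git.example.com
```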

File: Build Subgraphs workflow (deleted)

@@ -1,77 +0,0 @@
-name: Build Subgraphs
-on:
-  push:
-    branches:
-      - master
-    paths:
-      - 'FictionArchive.Service.*/**'
-      - 'FictionArchive.Common/**'
-      - 'FictionArchive.Service.Shared/**'
-jobs:
-  build-subgraphs:
-    runs-on: ubuntu-latest
-    strategy:
-      matrix:
-        service:
-          - name: novel-service
-            project: FictionArchive.Service.NovelService
-            subgraph: Novel
-          - name: translation-service
-            project: FictionArchive.Service.TranslationService
-            subgraph: Translation
-          - name: scheduler-service
-            project: FictionArchive.Service.SchedulerService
-            subgraph: Scheduler
-          - name: user-service
-            project: FictionArchive.Service.UserService
-            subgraph: User
-          - name: file-service
-            project: FictionArchive.Service.FileService
-            subgraph: File
-    steps:
-      - name: Checkout
-        uses: actions/checkout@v4
-      - name: Setup .NET
-        uses: actions/setup-dotnet@v4
-        with:
-          dotnet-version: '8.0.x'
-      - name: Install Fusion CLI
-        run: dotnet tool install -g HotChocolate.Fusion.CommandLine
-      - name: Restore dependencies
-        run: dotnet restore ${{ matrix.service.project }}/${{ matrix.service.project }}.csproj
-      - name: Build
-        run: dotnet build ${{ matrix.service.project }}/${{ matrix.service.project }}.csproj -c Release --no-restore
-      - name: Export schema
-        run: |
-          dotnet run --project ${{ matrix.service.project }}/${{ matrix.service.project }}.csproj \
-            --no-build -c Release --no-launch-profile \
-            -- schema export --output ${{ matrix.service.project }}/schema.graphql
-      - name: Pack subgraph
-        run: fusion subgraph pack -w ${{ matrix.service.project }}
-      - name: Upload subgraph package
-        uses: actions/upload-artifact@v4
-        with:
-          name: ${{ matrix.service.name }}-subgraph
-          path: ${{ matrix.service.project }}/*.fsp
-          retention-days: 30
-  # Trigger gateway build after all subgraphs are built
-  trigger-gateway:
-    runs-on: ubuntu-latest
-    needs: build-subgraphs
-    steps:
-      - name: Trigger gateway workflow
-        run: |
-          curl -X POST \
-            -H "Authorization: token ${{ secrets.GITEA_TOKEN }}" \
-            "${{ gitea.server_url }}/api/v1/repos/${{ gitea.repository }}/actions/workflows/build-gateway.yml/dispatches" \
-            -d '{"ref":"master"}'
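The deleted `trigger-gateway` job chained workflows by POSTing to Gitea's workflow-dispatch API. A dry-run sketch of the same call, with stand-in values for the server and repository (the real ones come from `gitea.*` expressions and a secret token inside the workflow):

```shell
# Stand-ins for the ${{ gitea.* }} expressions available inside the workflow.
SERVER_URL="https://git.example.com"
REPOSITORY="owner/FictionArchive"

# Build the dispatch endpoint the removed job targeted.
DISPATCH_URL="$SERVER_URL/api/v1/repos/$REPOSITORY/actions/workflows/build-gateway.yml/dispatches"
echo "$DISPATCH_URL"

# The job then POSTed to it (requires a real token, so shown as a comment):
#   curl -X POST -H "Authorization: token $GITEA_TOKEN" "$DISPATCH_URL" -d '{"ref":"master"}'
```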

File: CI workflow (build.yml)

@@ -20,22 +20,34 @@ jobs:
         with:
           dotnet-version: '8.0.x'
+      - name: Setup Python
+        uses: actions/setup-python@v5
+        with:
+          python-version: '3.12'
+      - name: Install Fusion CLI
+        run: dotnet tool install -g HotChocolate.Fusion.CommandLine
       - name: Restore dependencies
         run: dotnet restore FictionArchive.sln
       - name: Build solution
-        run: dotnet build FictionArchive.sln --configuration Release --no-restore
+        run: dotnet build FictionArchive.sln --configuration Release --no-restore /p:SkipFusionBuild=true
       - name: Run tests
-        run: dotnet test FictionArchive.sln --configuration Release --no-build --verbosity normal
+        run: |
+          dotnet test FictionArchive.sln --configuration Release --no-build --verbosity normal \
+            --logger "trx;LogFileName=test-results.trx" \
+            --collect:"XPlat Code Coverage" \
+            --results-directory ./TestResults
+      - name: Upload test results
+        uses: christopherhx/gitea-upload-artifact@v4
+        if: always()
+        with:
+          name: test-results
+          path: ./TestResults/**/*.trx
+          retention-days: 30
+      - name: Upload coverage results
+        uses: christopherhx/gitea-upload-artifact@v4
+        if: always()
+        with:
+          name: coverage-results
+          path: ./TestResults/**/coverage.cobertura.xml
+          retention-days: 30
   build-frontend:
     runs-on: ubuntu-latest
@@ -47,11 +59,10 @@ jobs:
         uses: actions/checkout@v4
       - name: Setup Node.js
-        uses: actions/setup-node@v4
+        uses: actions/setup-node@v6.0.0
         with:
           node-version: '20'
-          cache: 'npm'
-          cache-dependency-path: fictionarchive-web/package-lock.json
+          package-manager-cache: false
       - name: Install dependencies
         run: npm ci

View File

@@ -1,43 +1,49 @@
-name: Claude Assistant for Gitea
+name: Claude PR Assistant

 on:
-  # Trigger on issue comments (works on both issues and pull requests in Gitea)
   issue_comment:
     types: [created]
-  # Trigger on issues being opened or assigned
+  pull_request_review_comment:
+    types: [created]
   issues:
     types: [opened, assigned]
-  # Note: pull_request_review_comment has limited support in Gitea
-  # Use issue_comment instead which covers PR comments
+  pull_request_review:
+    types: [submitted]

 jobs:
-  claude-assistant:
-    # Basic trigger detection - check for @claude in comments or issue body
+  claude-code-action:
     if: |
       (github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
-      (github.event_name == 'issues' && (contains(github.event.issue.body, '@claude') || github.event.action == 'assigned'))
+      (github.event_name == 'pull_request_review_comment' && contains(github.event.comment.body, '@claude')) ||
+      (github.event_name == 'pull_request_review' && contains(github.event.review.body, '@claude')) ||
+      (github.event_name == 'issues' && contains(github.event.issue.body, '@claude'))
     runs-on: ubuntu-latest
     permissions:
       contents: write
       pull-requests: write
       issues: write
-      # Note: Gitea Actions may not require id-token: write for basic functionality
+      id-token: write
     steps:
       - name: Checkout repository
         uses: actions/checkout@v4
         with:
           fetch-depth: 0
-      - name: Run Claude Assistant
-        uses: markwylde/claude-code-gitea-action
+      - name: Run Claude PR Action
+        uses: markwylde/claude-code-gitea-action@v1.0.20
         with:
           claude_code_oauth_token: ${{ secrets.CLAUDE_CODE_OAUTH_TOKEN }}
           gitea_token: ${{ secrets.CLAUDE_GITEA_TOKEN }}
-          # Or use OAuth token instead:
-          # claude_code_oauth_token: ${{ secrets.CLAUDE_CODE_OAUTH_TOKEN }}
           timeout_minutes: "60"
-          trigger_phrase: "@claude"
-          # Optional: Customize for Gitea environment
-          custom_instructions: |
-            You are working in a Gitea environment. Be aware that:
-            - Some GitHub Actions features may behave differently
-            - Focus on core functionality and avoid advanced GitHub-specific features
-            - Use standard git operations when possible
+          # mode: tag  # Default: responds to @claude mentions
+          # Optional: Restrict network access to specific domains only
+          # experimental_allowed_domains: |
+          #   .anthropic.com
+          #   .github.com
+          #   api.github.com
+          #   .githubusercontent.com
+          #   bun.sh
+          #   registry.npmjs.org
+          #   .blob.core.windows.net

View File

@@ -15,8 +15,6 @@ jobs:
     strategy:
       matrix:
         service:
-          - name: api
-            dockerfile: FictionArchive.API/Dockerfile
           - name: novel-service
             dockerfile: FictionArchive.Service.NovelService/Dockerfile
           - name: user-service
@@ -40,6 +38,10 @@ jobs:
         id: version
         run: echo "VERSION=${GITHUB_REF_NAME#v}" >> $GITHUB_OUTPUT
+      - name: Extract registry hostname
+        id: registry
+        run: echo "HOST=$(echo '${{ gitea.server_url }}' | sed 's|https\?://||')" >> $GITHUB_OUTPUT
       - name: Log in to Gitea Container Registry
         uses: docker/login-action@v3
         with:
@@ -54,8 +56,8 @@ jobs:
           file: ${{ matrix.service.dockerfile }}
           push: true
           tags: |
-            ${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}-${{ matrix.service.name }}:${{ steps.version.outputs.VERSION }}
-            ${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}-${{ matrix.service.name }}:latest
+            ${{ steps.registry.outputs.HOST }}/${{ env.IMAGE_PREFIX }}-${{ matrix.service.name }}:${{ steps.version.outputs.VERSION }}
+            ${{ steps.registry.outputs.HOST }}/${{ env.IMAGE_PREFIX }}-${{ matrix.service.name }}:latest
           cache-from: type=gha
           cache-to: type=gha,mode=max
@@ -72,6 +74,10 @@ jobs:
         id: version
         run: echo "VERSION=${GITHUB_REF_NAME#v}" >> $GITHUB_OUTPUT
+      - name: Extract registry hostname
+        id: registry
+        run: echo "HOST=$(echo '${{ gitea.server_url }}' | sed 's|https\?://||')" >> $GITHUB_OUTPUT
       - name: Log in to Gitea Container Registry
         uses: docker/login-action@v3
         with:
@@ -92,7 +98,7 @@ jobs:
             VITE_OIDC_REDIRECT_URI=${{ vars.VITE_OIDC_REDIRECT_URI }}
            VITE_OIDC_POST_LOGOUT_REDIRECT_URI=${{ vars.VITE_OIDC_POST_LOGOUT_REDIRECT_URI }}
           tags: |
-            ${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}-frontend:${{ steps.version.outputs.VERSION }}
-            ${{ env.REGISTRY }}/${{ env.IMAGE_PREFIX }}-frontend:latest
+            ${{ steps.registry.outputs.HOST }}/${{ env.IMAGE_PREFIX }}-frontend:${{ steps.version.outputs.VERSION }}
+            ${{ steps.registry.outputs.HOST }}/${{ env.IMAGE_PREFIX }}-frontend:latest
           cache-from: type=gha
           cache-to: type=gha,mode=max
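The two helper steps above are plain POSIX shell. A quick sketch with hypothetical values (`v1.2.3` and `gitea.example.com` stand in for the real tag name and server URL) shows what each one computes:

```shell
# Hypothetical stand-ins for $GITHUB_REF_NAME and ${{ gitea.server_url }}.
GITHUB_REF_NAME="v1.2.3"
SERVER_URL="https://gitea.example.com"

# "${VAR#v}" strips a leading "v", turning the tag into a bare version.
VERSION="${GITHUB_REF_NAME#v}"
echo "$VERSION"    # 1.2.3

# The sed expression removes an http:// or https:// scheme prefix,
# leaving just the hostname docker login/push expects.
HOST=$(echo "$SERVER_URL" | sed 's|https\?://||')
echo "$HOST"       # gitea.example.com
```

Using `|` as the sed delimiter avoids having to escape the slashes in the URL.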

View File

@@ -7,9 +7,9 @@ This document describes the CI/CD pipeline configuration for FictionArchive usin
 | Workflow | File | Trigger | Purpose |
 |----------|------|---------|---------|
 | CI | `build.yml` | Push/PR to master | Build and test all projects |
-| Build Subgraphs | `build-subgraphs.yml` | Push to master (service changes) | Build GraphQL subgraph packages |
-| Build Gateway | `build-gateway.yml` | Manual or triggered by subgraphs | Compose gateway and build Docker image |
+| Build Gateway | `build-gateway.yml` | Tag `v*.*.*` or manual | Build subgraphs, compose gateway, push API image |
 | Release | `release.yml` | Tag `v*.*.*` | Build and push all Docker images |
+| Claude PR Assistant | `claude_assistant.yml` | Issue/PR comments with @claude | AI-assisted code review and issue handling |

 ## Pipeline Architecture
@@ -18,27 +18,32 @@
 │                         Push to master                              │
 └─────────────────────────────┬───────────────────────────────────────┘
-              ┌───────────────┴───────────────┐
-              ▼                               ▼
-┌─────────────────────────┐     ┌─────────────────────────┐
-│        build.yml        │     │   build-subgraphs.yml   │
-│  (CI checks - always)   │     │   (if service changes)  │
-└─────────────────────────┘     └────────────┬────────────┘
-                                             ▼
-                                ┌─────────────────────────┐
-                                │    build-gateway.yml    │
-                                │  (compose & push API)   │
-                                └─────────────────────────┘
+                              ▼
+                ┌─────────────────────────┐
+                │        build.yml        │
+                │       (CI checks)       │
+                └─────────────────────────┘

 ┌─────────────────────────────────────────────────────────────────────┐
 │                        Push tag v*.*.*                              │
 └─────────────────────────────┬───────────────────────────────────────┘
-                              ▼
-                ┌─────────────────────────┐
-                │       release.yml       │
-                │   (build & push all)    │
-                └─────────────────────────┘
+              ┌───────────────┴───────────────┐
+              ▼                               ▼
+┌─────────────────────────┐     ┌─────────────────────────┐
+│       release.yml       │     │    build-gateway.yml    │
+│  (build & push all      │     │  (build subgraphs &     │
+│  backend + frontend)    │     │   push API gateway)     │
+└─────────────────────────┘     └─────────────────────────┘
+
+┌─────────────────────────────────────────────────────────────────────┐
+│              Issue/PR comment containing @claude                    │
+└─────────────────────────────┬───────────────────────────────────────┘
+                              ▼
+                ┌─────────────────────────┐
+                │  claude_assistant.yml   │
+                │   (AI code assistance)  │
+                └─────────────────────────┘
 ```
@@ -51,14 +56,15 @@ Configure these in **Settings → Actions → Secrets**:
 | Secret | Description | Required By |
 |--------|-------------|-------------|
 | `REGISTRY_TOKEN` | Gitea access token with `write:package` scope | `release.yml`, `build-gateway.yml` |
-| `GITEA_TOKEN` | Gitea access token for API calls | `build-subgraphs.yml` |
+| `CLAUDE_CODE_OAUTH_TOKEN` | Claude Code OAuth token | `claude_assistant.yml` |
+| `CLAUDE_GITEA_TOKEN` | Gitea token for Claude assistant | `claude_assistant.yml` |

 #### Creating Access Tokens

 1. Go to **Settings → Applications → Access Tokens**
 2. Create a new token with the following scopes:
    - `write:package` - Push container images
-   - `write:repository` - Trigger workflows via API
+   - `write:repository` - For Claude assistant to push commits
 3. Copy the token and add it as a repository secret

 ### Repository Variables
@@ -85,42 +91,62 @@ Configure these in **Settings → Actions → Variables**:

 **Requirements:**
 - .NET 8.0 SDK
+- Python 3.12
 - Node.js 20
+- HotChocolate Fusion CLI

-### Build Subgraphs (`build-subgraphs.yml`)
+**Steps (Backend):**
+1. Checkout repository
+2. Setup .NET 8.0
+3. Restore dependencies
+4. Build solution (Release, with `SkipFusionBuild=true`)
+5. Run tests

-**Trigger:** Push to `master` with changes in:
-- `FictionArchive.Service.*/**`
-- `FictionArchive.Common/**`
-- `FictionArchive.Service.Shared/**`
-
-**Jobs:**
-1. `build-subgraphs` - Matrix job building each service's `.fsp` package
-2. `trigger-gateway` - Triggers gateway rebuild via API
-
-**Subgraphs Built:**
-- Novel Service
-- Translation Service
-- Scheduler Service
-- User Service
-- File Service
-
-**Artifacts:** Each subgraph produces a `.fsp` file retained for 30 days.
+**Steps (Frontend):**
+1. Checkout repository
+2. Setup Node.js 20
+3. Install dependencies (`npm ci`)
+4. Run linter (`npm run lint`)
+5. Build application (`npm run build`)

 ### Build Gateway (`build-gateway.yml`)

 **Trigger:**
 - Manual dispatch (`workflow_dispatch`)
-- Push to `master` with changes in `FictionArchive.API/**`
-- Triggered by `build-subgraphs.yml` completion
+- Push tag matching `v*.*.*`

-**Process:**
-1. Downloads all subgraph `.fsp` artifacts
-2. Configures Docker-internal URLs for each subgraph
-3. Composes gateway schema using Fusion CLI
-4. Builds and pushes API Docker image
+**Jobs:**
+
+#### 1. `build-subgraphs` (Matrix Job)
+
+Builds GraphQL subgraph packages for each service:
+
+| Service | Project | Subgraph Name |
+|---------|---------|---------------|
+| novel-service | FictionArchive.Service.NovelService | Novel |
+| translation-service | FictionArchive.Service.TranslationService | Translation |
+| scheduler-service | FictionArchive.Service.SchedulerService | Scheduler |
+| user-service | FictionArchive.Service.UserService | User |
+
+**Note:** File Service and Authentication Service are not subgraphs (no GraphQL schema).
+
+**Steps:**
+1. Checkout repository
+2. Setup .NET 8.0
+3. Install HotChocolate Fusion CLI
+4. Restore and build service project
+5. Export GraphQL schema (`schema export`)
+6. Pack subgraph into `.fsp` file
+7. Upload artifact (retained 30 days)
+
+#### 2. `build-gateway` (Depends on `build-subgraphs`)
+
+Composes the API gateway from subgraph packages.
+
+**Steps:**
+1. Checkout repository
+2. Setup .NET 8.0 and Fusion CLI
+3. Download all subgraph artifacts
+4. Configure Docker-internal URLs (`http://{service}-service:8080/graphql`)
+5. Compose gateway schema using Fusion CLI
+6. Build gateway project
+7. Build and push Docker image

 **Image Tags:**
 - `<registry>/<owner>/fictionarchive-api:latest`
@@ -131,23 +157,54 @@
 **Trigger:** Push tag matching `v*.*.*` (e.g., `v1.0.0`)

 **Jobs:**
-1. `build-and-push` - Matrix job building all backend service images
-2. `build-frontend` - Builds and pushes frontend image

-**Services Built:**
-- `fictionarchive-api`
-- `fictionarchive-novel-service`
-- `fictionarchive-user-service`
-- `fictionarchive-translation-service`
-- `fictionarchive-file-service`
-- `fictionarchive-scheduler-service`
-- `fictionarchive-authentication-service`
-- `fictionarchive-frontend`
+#### 1. `build-and-push` (Matrix Job)
+
+Builds and pushes all backend service images:
+
+| Service | Dockerfile |
+|---------|------------|
+| novel-service | FictionArchive.Service.NovelService/Dockerfile |
+| user-service | FictionArchive.Service.UserService/Dockerfile |
+| translation-service | FictionArchive.Service.TranslationService/Dockerfile |
+| file-service | FictionArchive.Service.FileService/Dockerfile |
+| scheduler-service | FictionArchive.Service.SchedulerService/Dockerfile |
+| authentication-service | FictionArchive.Service.AuthenticationService/Dockerfile |
+
+#### 2. `build-frontend`
+
+Builds and pushes the frontend image with environment-specific build arguments.
+
+**Build Args:**
+- `VITE_GRAPHQL_URI`
+- `VITE_OIDC_AUTHORITY`
+- `VITE_OIDC_CLIENT_ID`
+- `VITE_OIDC_REDIRECT_URI`
+- `VITE_OIDC_POST_LOGOUT_REDIRECT_URI`

 **Image Tags:**
 - `<registry>/<owner>/fictionarchive-<service>:<version>`
 - `<registry>/<owner>/fictionarchive-<service>:latest`
+
+### Claude PR Assistant (`claude_assistant.yml`)
+
+**Trigger:** Comments or issues containing `@claude`:
+- Issue comments
+- Pull request review comments
+- Pull request reviews
+- New issues (opened or assigned)
+
+**Permissions Required:**
+- `contents: write`
+- `pull-requests: write`
+- `issues: write`
+- `id-token: write`
+
+**Usage:**
+Mention `@claude` in any issue or PR comment to invoke the AI assistant for:
+- Code review assistance
+- Bug analysis
+- Implementation suggestions
+- Documentation help

 ## Container Registry

 Images are pushed to the Gitea Container Registry at:
@@ -155,6 +212,19 @@ Images are pushed to the Gitea Container Registry at:
 <gitea-server-url>/<repository-owner>/fictionarchive-<service>:<tag>
 ```
+
+### Image Naming Convention
+
+| Image | Description |
+|-------|-------------|
+| `fictionarchive-api` | API Gateway (GraphQL Federation) |
+| `fictionarchive-novel-service` | Novel Service |
+| `fictionarchive-user-service` | User Service |
+| `fictionarchive-translation-service` | Translation Service |
+| `fictionarchive-file-service` | File Service |
+| `fictionarchive-scheduler-service` | Scheduler Service |
+| `fictionarchive-authentication-service` | Authentication Service |
+| `fictionarchive-frontend` | Web Frontend |
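Putting the convention together, a small sketch (with a hypothetical registry host and owner) shows how the workflows assemble a full image reference from its parts:

```shell
# Hypothetical registry host and owner prefix; the real values come from the
# "Extract registry hostname" step and the IMAGE_PREFIX workflow env var.
HOST="gitea.example.com"
IMAGE_PREFIX="owner/fictionarchive"
SERVICE="api"
VERSION="1.2.3"

# Mirrors the tag format used in the workflows: HOST/PREFIX-SERVICE:TAG
IMAGE="${HOST}/${IMAGE_PREFIX}-${SERVICE}:${VERSION}"
echo "$IMAGE"    # gitea.example.com/owner/fictionarchive-api:1.2.3
```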
 ### Pulling Images

 ```bash
@@ -184,13 +254,13 @@ docker pull <gitea-server-url>/<owner>/fictionarchive-api:latest
 - Ensure the `REGISTRY_TOKEN` secret is configured in repository settings
 - Verify the token has `write:package` scope

-**"Failed to trigger gateway workflow"**
-- Ensure `GITEA_TOKEN` secret is configured
-- Verify the token has `write:repository` scope
-
 **"No subgraph artifacts found"**
-- The gateway build requires subgraph artifacts from a previous `build-subgraphs` run
-- Trigger `build-subgraphs.yml` manually or push a change to a service
+- The gateway build requires subgraph artifacts from the `build-subgraphs` job
+- If subgraph builds failed, check the matrix job logs for errors
+
+**"Schema export failed"**
+- Ensure the service project has a valid `subgraph-config.json`
+- Check that the service starts correctly for schema export

 ### Frontend Build Failures

@@ -204,6 +274,13 @@ docker pull <gitea-server-url>/<owner>/fictionarchive-api:latest
 - Verify `REGISTRY_TOKEN` has correct permissions
 - Check that the token hasn't expired

+### Claude Assistant Failures
+
+**"Claude assistant not responding"**
+- Verify `CLAUDE_CODE_OAUTH_TOKEN` is configured
+- Verify `CLAUDE_GITEA_TOKEN` is configured and has write permissions
+- Check that the comment contains `@claude` mention
+
 ## Local Testing

 To test workflows locally before pushing:

View File

@@ -13,6 +13,7 @@
     <PackageReference Include="HotChocolate.Data.EntityFramework" Version="15.1.11" />
     <PackageReference Include="HotChocolate.Fusion" Version="15.1.11" />
     <PackageReference Include="HotChocolate.Types.Scalars" Version="15.1.11" />
+    <PackageReference Include="Microsoft.AspNetCore.HeaderPropagation" Version="8.0.22" />
     <PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="9.0.11">
       <PrivateAssets>all</PrivateAssets>
       <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
@@ -20,6 +21,7 @@
     <PackageReference Include="Microsoft.EntityFrameworkCore.Relational" Version="9.0.11" />
     <PackageReference Include="Microsoft.VisualStudio.Web.CodeGeneration.Design" Version="8.0.7" />
     <PackageReference Include="Swashbuckle.AspNetCore" Version="6.6.2"/>
+    <PackageReference Include="Microsoft.AspNetCore.Authentication.JwtBearer" Version="8.0.11" />
   </ItemGroup>

   <!-- Builds the Fusion graph file before building the application itself (skipped in CI) -->

View File

@@ -12,7 +12,11 @@ public class Program
     #region Fusion Gateway

-    builder.Services.AddHttpClient("Fusion");
+    builder.Services.AddHttpClient("Fusion")
+        .AddHeaderPropagation(opt =>
+        {
+            opt.Headers.Add("Authorization");
+        });

     builder.Services
         .AddFusionGatewayServer()
@@ -21,23 +25,29 @@ public class Program
     #endregion

+    // Add authentication
+    builder.Services.AddOidcAuthentication(builder.Configuration);
+
     builder.Services.AddCors(options =>
     {
-        options.AddPolicy("AllowAllOrigins",
-            builder =>
+        options.AddPolicy("AllowFictionArchiveOrigins",
+            policyBuilder =>
             {
-                builder.AllowAnyOrigin()
+                policyBuilder.WithOrigins("https://fictionarchive.orfl.xyz", "http://localhost:5173")
                     .AllowAnyMethod()
-                    .AllowAnyHeader();
+                    .AllowAnyHeader()
+                    .AllowCredentials();
             });
     });

     var app = builder.Build();

-    app.UseCors("AllowAllOrigins");
+    app.UseCors("AllowFictionArchiveOrigins");

     app.MapHealthChecks("/healthz");

+    app.UseHeaderPropagation();
+
     app.MapGraphQL();

     app.RunWithGraphQLCommands(args);

View File

@@ -5,5 +5,15 @@
       "Microsoft.AspNetCore": "Warning"
     }
   },
-  "AllowedHosts": "*"
+  "AllowedHosts": "*",
+  "OIDC": {
+    "Authority": "https://auth.orfl.xyz/application/o/fiction-archive/",
+    "ClientId": "fictionarchive-api",
+    "Audience": "fictionarchive-api",
+    "ValidIssuer": "https://auth.orfl.xyz/application/o/fiction-archive/",
+    "ValidateIssuer": true,
+    "ValidateAudience": true,
+    "ValidateLifetime": true,
+    "ValidateIssuerSigningKey": true
+  }
 }
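The commit message for this change notes that OIDC configuration is validated at startup, with HTTPS required for the Authority except during localhost development. A hypothetical shell sketch of that rule (the actual check lives in the shared C# configuration validation, not in shell):

```shell
# Hypothetical re-statement of the Authority validation rule:
# an https:// scheme is required unless the host is localhost/127.0.0.1.
check_authority() {
  case "$1" in
    https://*) echo ok ;;
    http://localhost*|http://127.0.0.1*) echo ok ;;
    *) echo invalid ;;
  esac
}

check_authority "https://auth.orfl.xyz/application/o/fiction-archive/"   # ok
check_authority "http://localhost:9000/"                                 # ok
check_authority "http://auth.orfl.xyz/"                                  # invalid
```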

View File

@@ -2,6 +2,7 @@ using System.Web;
 using Amazon.S3;
 using Amazon.S3.Model;
 using FictionArchive.Service.FileService.Models;
+using Microsoft.AspNetCore.Authorization;
 using Microsoft.AspNetCore.Http;
 using Microsoft.AspNetCore.Mvc;
 using Microsoft.Extensions.Options;
@@ -10,6 +11,7 @@ namespace FictionArchive.Service.FileService.Controllers
 {
     [Route("api/{*path}")]
     [ApiController]
+    [Authorize]
     public class S3ProxyController : ControllerBase
     {
         private readonly AmazonS3Client _amazonS3Client;

View File

@@ -7,17 +7,17 @@ EXPOSE 8081
 FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
 ARG BUILD_CONFIGURATION=Release
 WORKDIR /src
-COPY ["FictionArchive.Service.ImageService/FictionArchive.Service.ImageService.csproj", "FictionArchive.Service.ImageService/"]
-RUN dotnet restore "FictionArchive.Service.ImageService/FictionArchive.Service.ImageService.csproj"
+COPY ["FictionArchive.Service.FileService/FictionArchive.Service.FileService.csproj", "FictionArchive.Service.FileService/"]
+RUN dotnet restore "FictionArchive.Service.FileService/FictionArchive.Service.FileService.csproj"
 COPY . .
-WORKDIR "/src/FictionArchive.Service.ImageService"
-RUN dotnet build "./FictionArchive.Service.ImageService.csproj" -c $BUILD_CONFIGURATION -o /app/build
+WORKDIR "/src/FictionArchive.Service.FileService"
+RUN dotnet build "./FictionArchive.Service.FileService.csproj" -c $BUILD_CONFIGURATION -o /app/build

 FROM build AS publish
 ARG BUILD_CONFIGURATION=Release
-RUN dotnet publish "./FictionArchive.Service.ImageService.csproj" -c $BUILD_CONFIGURATION -o /app/publish /p:UseAppHost=false
+RUN dotnet publish "./FictionArchive.Service.FileService.csproj" -c $BUILD_CONFIGURATION -o /app/publish /p:UseAppHost=false

 FROM base AS final
 WORKDIR /app
 COPY --from=publish /app/publish .
-ENTRYPOINT ["dotnet", "FictionArchive.Service.ImageService.dll"]
+ENTRYPOINT ["dotnet", "FictionArchive.Service.FileService.dll"]

View File

@@ -21,6 +21,7 @@
     <PackageReference Include="AWSSDK.S3" Version="4.0.13.1" />
     <PackageReference Include="Microsoft.VisualStudio.Web.CodeGeneration.Design" Version="9.0.0" />
     <PackageReference Include="Swashbuckle.AspNetCore" Version="10.0.1" />
+    <PackageReference Include="Microsoft.AspNetCore.Authentication.JwtBearer" Version="8.0.11" />
   </ItemGroup>

   <ItemGroup>
<ItemGroup> <ItemGroup>

View File

@@ -34,6 +34,10 @@ public class Program
     #endregion

+    // Add authentication with cookie support
+    builder.Services.AddOidcCookieAuthentication(builder.Configuration);
+    builder.Services.AddFictionArchiveAuthorization();
+
     builder.Services.Configure<ProxyConfiguration>(builder.Configuration.GetSection("ProxyConfiguration"));

     // Add S3 Client
@@ -60,6 +64,9 @@ public class Program
         app.UseSwaggerUI();
     }

+    app.UseAuthentication();
+    app.UseAuthorization();
+
     app.MapHealthChecks("/healthz");

     app.MapControllers();

View File

@@ -18,5 +18,15 @@
     "AccessKey": "REPLACE_ME",
     "SecretKey": "REPLACE_ME"
   },
+  "OIDC": {
+    "Authority": "https://auth.orfl.xyz/application/o/fiction-archive/",
+    "ClientId": "fictionarchive-files",
+    "Audience": "fictionarchive-api",
+    "ValidIssuer": "https://auth.orfl.xyz/application/o/fiction-archive/",
+    "ValidateIssuer": true,
+    "ValidateAudience": true,
+    "ValidateLifetime": true,
+    "ValidateIssuerSigningKey": true
+  },
   "AllowedHosts": "*"
 }

View File

@@ -6,32 +6,24 @@ using FictionArchive.Service.NovelService.Models.SourceAdapters;
 using FictionArchive.Service.NovelService.Services;
 using FictionArchive.Service.NovelService.Services.SourceAdapters;
 using FictionArchive.Service.Shared.Services.EventBus;
+using HotChocolate.Authorization;
 using Microsoft.EntityFrameworkCore;

 namespace FictionArchive.Service.NovelService.GraphQL;

 public class Mutation
 {
-    public async Task<NovelUpdateRequestedEvent> ImportNovel(string novelUrl, IEventBus eventBus)
+    [Authorize]
+    public async Task<NovelUpdateRequestedEvent> ImportNovel(string novelUrl, NovelUpdateService service)
     {
-        var importNovelRequestEvent = new NovelUpdateRequestedEvent()
-        {
-            NovelUrl = novelUrl
-        };
-        await eventBus.Publish(importNovelRequestEvent);
-        return importNovelRequestEvent;
+        return await service.QueueNovelImport(novelUrl);
     }

+    [Authorize]
     public async Task<ChapterPullRequestedEvent> FetchChapterContents(uint novelId,
         uint chapterNumber,
-        IEventBus eventBus)
+        NovelUpdateService service)
     {
-        var chapterPullEvent = new ChapterPullRequestedEvent()
-        {
-            NovelId = novelId,
-            ChapterNumber = chapterNumber
-        };
-        await eventBus.Publish(chapterPullEvent);
-        return chapterPullEvent;
+        return await service.QueueChapterPull(novelId, chapterNumber);
     }
 }

View File

@@ -1,5 +1,6 @@
 using FictionArchive.Service.NovelService.Models.Novels;
 using FictionArchive.Service.NovelService.Services;
+using HotChocolate.Authorization;
 using HotChocolate.Data;
 using HotChocolate.Types;

@@ -7,6 +8,7 @@ namespace FictionArchive.Service.NovelService.GraphQL;

 public class Query
 {
+    [Authorize]
     [UsePaging]
     [UseProjection]
     [UseFiltering]

View File

@@ -6,6 +6,7 @@ using FictionArchive.Service.NovelService.Services;
 using FictionArchive.Service.NovelService.Services.EventHandlers;
 using FictionArchive.Service.NovelService.Services.SourceAdapters;
 using FictionArchive.Service.NovelService.Services.SourceAdapters.Novelpia;
+using FictionArchive.Service.Shared;
 using FictionArchive.Service.Shared.Extensions;
 using FictionArchive.Service.Shared.Services.EventBus.Implementations;
 using FictionArchive.Service.Shared.Services.GraphQL;
@@ -17,6 +18,8 @@ public class Program
 {
     public static void Main(string[] args)
     {
+        var isSchemaExport = SchemaExportDetector.IsSchemaExportMode(args);
+
         var builder = WebApplication.CreateBuilder(args);

         builder.AddLocalAppsettings();
@@ -24,26 +27,32 @@ public class Program
         #region Event Bus

-        builder.Services.AddRabbitMQ(opt =>
-        {
-            builder.Configuration.GetSection("RabbitMQ").Bind(opt);
-        })
-            .Subscribe<TranslationRequestCompletedEvent, TranslationRequestCompletedEventHandler>()
-            .Subscribe<NovelUpdateRequestedEvent, NovelUpdateRequestedEventHandler>()
-            .Subscribe<ChapterPullRequestedEvent, ChapterPullRequestedEventHandler>()
-            .Subscribe<FileUploadRequestStatusUpdateEvent, FileUploadRequestStatusUpdateEventHandler>();
+        if (!isSchemaExport)
+        {
+            builder.Services.AddRabbitMQ(opt =>
+                {
+                    builder.Configuration.GetSection("RabbitMQ").Bind(opt);
+                })
+                .Subscribe<TranslationRequestCompletedEvent, TranslationRequestCompletedEventHandler>()
+                .Subscribe<NovelUpdateRequestedEvent, NovelUpdateRequestedEventHandler>()
+                .Subscribe<ChapterPullRequestedEvent, ChapterPullRequestedEventHandler>()
+                .Subscribe<FileUploadRequestStatusUpdateEvent, FileUploadRequestStatusUpdateEventHandler>();
+        }

         #endregion

         #region GraphQL

-        builder.Services.AddDefaultGraphQl<Query, Mutation>();
+        builder.Services.AddDefaultGraphQl<Query, Mutation>()
+            .AddAuthorization();

         #endregion

         #region Database

-        builder.Services.RegisterDbContext<NovelServiceDbContext>(builder.Configuration.GetConnectionString("DefaultConnection"));
+        builder.Services.RegisterDbContext<NovelServiceDbContext>(
+            builder.Configuration.GetConnectionString("DefaultConnection"),
+            skipInfrastructure: isSchemaExport);

         #endregion
@@ -67,11 +76,16 @@ public class Program
         builder.Services.AddHealthChecks();

+        // Authentication & Authorization
+        builder.Services.AddOidcAuthentication(builder.Configuration);
+        builder.Services.AddFictionArchiveAuthorization();
+
         var app = builder.Build();

-        // Update database
-        using (var scope = app.Services.CreateScope())
+        // Update database (skip in schema export mode)
+        if (!isSchemaExport)
         {
+            using var scope = app.Services.CreateScope();
             var dbContext = scope.ServiceProvider.GetRequiredService<NovelServiceDbContext>();
             dbContext.UpdateDatabase();
         }
@@ -80,6 +94,9 @@ public class Program
         app.MapHealthChecks("/healthz");

+        app.UseAuthentication();
+        app.UseAuthorization();
+
         app.MapGraphQL();

         app.RunWithGraphQLCommands(args);

View File

@@ -2,6 +2,7 @@ using FictionArchive.Service.FileService.IntegrationEvents;
 using FictionArchive.Service.NovelService.Models.Configuration;
 using FictionArchive.Service.NovelService.Models.Enums;
 using FictionArchive.Service.NovelService.Models.Images;
+using FictionArchive.Service.NovelService.Models.IntegrationEvents;
 using FictionArchive.Service.NovelService.Models.Localization;
 using FictionArchive.Service.NovelService.Models.Novels;
 using FictionArchive.Service.NovelService.Models.SourceAdapters;
@@ -201,4 +202,25 @@ public class NovelUpdateService
         await _dbContext.SaveChangesAsync();
     }
+
+    public async Task<NovelUpdateRequestedEvent> QueueNovelImport(string novelUrl)
+    {
+        var importNovelRequestEvent = new NovelUpdateRequestedEvent()
+        {
+            NovelUrl = novelUrl
+        };
+        await _eventBus.Publish(importNovelRequestEvent);
+        return importNovelRequestEvent;
+    }
+
+    public async Task<ChapterPullRequestedEvent> QueueChapterPull(uint novelId, uint chapterNumber)
+    {
+        var chapterPullEvent = new ChapterPullRequestedEvent()
+        {
+            NovelId = novelId,
+            ChapterNumber = chapterNumber
+        };
+        await _eventBus.Publish(chapterPullEvent);
+        return chapterPullEvent;
+    }
 }

View File

@@ -19,5 +19,15 @@
     "ConnectionString": "amqp://localhost",
     "ClientIdentifier": "NovelService"
   },
-  "AllowedHosts": "*"
+  "AllowedHosts": "*",
+  "OIDC": {
+    "Authority": "https://auth.orfl.xyz/application/o/fiction-archive/",
+    "ClientId": "ldi5IpEidq2WW0Ka1lehVskb2SOBjnYRaZCpEyBh",
+    "Audience": "ldi5IpEidq2WW0Ka1lehVskb2SOBjnYRaZCpEyBh",
+    "ValidIssuer": "https://auth.orfl.xyz/application/o/fiction-archive/",
+    "ValidateIssuer": true,
+    "ValidateAudience": true,
+    "ValidateLifetime": true,
+    "ValidateIssuerSigningKey": true
+  }
 }

View File

@@ -1,6 +1,6 @@
 {
   "subgraph": "Novels",
   "http": {
-    "baseAddress": "http://localhost:5101/graphql"
+    "baseAddress": "https://localhost:7208/graphql"
   }
 }

View File

@@ -1,6 +1,8 @@
 using System.Data;
 using FictionArchive.Service.SchedulerService.Models;
 using FictionArchive.Service.SchedulerService.Services;
+using FictionArchive.Service.Shared.Constants;
+using HotChocolate.Authorization;
 using HotChocolate.Types;
 using Quartz;
@@ -10,18 +12,21 @@ public class Mutation
 {
     [Error<DuplicateNameException>]
     [Error<FormatException>]
+    [Authorize(Roles = [AuthorizationConstants.Roles.Admin])]
     public async Task<SchedulerJob> ScheduleEventJob(string key, string description, string eventType, string eventData, string cronSchedule, JobManagerService jobManager)
     {
         return await jobManager.ScheduleEventJob(key, description, eventType, eventData, cronSchedule);
     }

     [Error<JobPersistenceException>]
+    [Authorize(Roles = [AuthorizationConstants.Roles.Admin])]
     public async Task<bool> RunJob(string jobKey, JobManagerService jobManager)
     {
         return await jobManager.TriggerJob(jobKey);
     }

     [Error<KeyNotFoundException>]
+    [Authorize(Roles = [AuthorizationConstants.Roles.Admin])]
     public async Task<bool> DeleteJob(string jobKey, JobManagerService jobManager)
     {
         bool deleted = await jobManager.DeleteJob(jobKey);

View File

@@ -1,5 +1,6 @@
 using FictionArchive.Service.SchedulerService.GraphQL;
 using FictionArchive.Service.SchedulerService.Services;
+using FictionArchive.Service.Shared;
 using FictionArchive.Service.Shared.Extensions;
 using FictionArchive.Service.Shared.Services.EventBus.Implementations;
 using Quartz;
@@ -11,54 +12,79 @@ public class Program
 {
     public static void Main(string[] args)
     {
+        var isSchemaExport = SchemaExportDetector.IsSchemaExportMode(args);
+
         var builder = WebApplication.CreateBuilder(args);

         // Services
-        builder.Services.AddDefaultGraphQl<Query, Mutation>();
+        builder.Services.AddDefaultGraphQl<Query, Mutation>()
+            .AddAuthorization();
         builder.Services.AddHealthChecks();
         builder.Services.AddTransient<JobManagerService>();

+        // Authentication & Authorization
+        builder.Services.AddOidcAuthentication(builder.Configuration);
+        builder.Services.AddFictionArchiveAuthorization();
+
         #region Database
-        builder.Services.RegisterDbContext<SchedulerServiceDbContext>(builder.Configuration.GetConnectionString("DefaultConnection"));
+        builder.Services.RegisterDbContext<SchedulerServiceDbContext>(
+            builder.Configuration.GetConnectionString("DefaultConnection"),
+            skipInfrastructure: isSchemaExport);
         #endregion

         #region Event Bus
-        builder.Services.AddRabbitMQ(opt =>
-        {
-            builder.Configuration.GetSection("RabbitMQ").Bind(opt);
-        });
+        if (!isSchemaExport)
+        {
+            builder.Services.AddRabbitMQ(opt =>
+            {
+                builder.Configuration.GetSection("RabbitMQ").Bind(opt);
+            });
+        }
         #endregion

         #region Quartz
-        builder.Services.AddQuartz(opt =>
-        {
-            opt.UsePersistentStore(pso =>
-            {
-                pso.UsePostgres(pgsql =>
-                {
-                    pgsql.ConnectionString = builder.Configuration.GetConnectionString("DefaultConnection");
-                    pgsql.UseDriverDelegate<PostgreSQLDelegate>();
-                    pgsql.TablePrefix = "quartz.qrtz_"; // Needed for Postgres due to the differing schema used
-                });
-                pso.UseNewtonsoftJsonSerializer();
-            });
-        });
-        builder.Services.AddQuartzHostedService(opt =>
-        {
-            opt.WaitForJobsToComplete = true;
-        });
+        if (isSchemaExport)
+        {
+            // Schema export mode: use in-memory store (no DB connection needed)
+            builder.Services.AddQuartz(opt =>
+            {
+                opt.UseInMemoryStore();
+            });
+        }
+        else
+        {
+            builder.Services.AddQuartz(opt =>
+            {
+                opt.UsePersistentStore(pso =>
+                {
+                    pso.UsePostgres(pgsql =>
+                    {
+                        pgsql.ConnectionString = builder.Configuration.GetConnectionString("DefaultConnection");
+                        pgsql.UseDriverDelegate<PostgreSQLDelegate>();
+                        pgsql.TablePrefix = "quartz.qrtz_"; // Needed for Postgres due to the differing schema used
+                    });
+                    pso.UseNewtonsoftJsonSerializer();
+                });
+            });
+            builder.Services.AddQuartzHostedService(opt =>
+            {
+                opt.WaitForJobsToComplete = true;
+            });
+        }
         #endregion

         var app = builder.Build();

-        using (var scope = app.Services.CreateScope())
+        // Update database (skip in schema export mode)
+        if (!isSchemaExport)
         {
+            using var scope = app.Services.CreateScope();
             var dbContext = scope.ServiceProvider.GetRequiredService<SchedulerServiceDbContext>();
             dbContext.UpdateDatabase();
         }
@@ -67,6 +93,9 @@ public class Program
         app.MapHealthChecks("/healthz");

+        app.UseAuthentication();
+        app.UseAuthorization();
+
         app.MapGraphQL();
         app.RunWithGraphQLCommands(args);

View File

@@ -12,5 +12,15 @@
   "ConnectionStrings": {
     "DefaultConnection": "Host=localhost;Database=FictionArchive_SchedulerService;Username=postgres;password=postgres"
   },
-  "AllowedHosts": "*"
+  "AllowedHosts": "*",
+  "OIDC": {
+    "Authority": "https://auth.orfl.xyz/application/o/fiction-archive/",
+    "ClientId": "fictionarchive-api",
+    "Audience": "fictionarchive-api",
+    "ValidIssuer": "https://auth.orfl.xyz/application/o/fiction-archive/",
+    "ValidateIssuer": true,
+    "ValidateAudience": true,
+    "ValidateLifetime": true,
+    "ValidateIssuerSigningKey": true
+  }
 }

View File

@@ -0,0 +1,15 @@
+namespace FictionArchive.Service.Shared.Constants;
+
+public static class AuthorizationConstants
+{
+    public static class Roles
+    {
+        public const string Admin = "admin";
+    }
+
+    public static class Policies
+    {
+        public const string Admin = "Admin";
+        public const string User = "User";
+    }
+}

View File

@@ -0,0 +1,168 @@
+using Microsoft.AspNetCore.Authentication.JwtBearer;
+using Microsoft.Extensions.DependencyInjection;
+using Microsoft.Extensions.Configuration;
+using Microsoft.Extensions.Logging;
+using Microsoft.IdentityModel.Tokens;
+using FictionArchive.Service.Shared.Constants;
+using FictionArchive.Service.Shared.Models.Authentication;
+using System.Linq;
+
+namespace FictionArchive.Service.Shared.Extensions;
+
+public static class AuthenticationExtensions
+{
+    public static IServiceCollection AddOidcAuthentication(this IServiceCollection services, IConfiguration configuration)
+    {
+        var oidcConfig = configuration.GetSection("OIDC").Get<OidcConfiguration>();
+        if (oidcConfig == null)
+        {
+            throw new InvalidOperationException("OIDC configuration is required but not found in app settings");
+        }
+
+        ValidateOidcConfiguration(oidcConfig);
+
+        services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
+            .AddJwtBearer(options =>
+            {
+                options.Authority = oidcConfig.Authority;
+                options.Audience = oidcConfig.Audience;
+                options.RequireHttpsMetadata = !string.IsNullOrEmpty(oidcConfig.Authority) && oidcConfig.Authority.StartsWith("https://");
+
+                options.TokenValidationParameters = new TokenValidationParameters
+                {
+                    ValidateIssuer = oidcConfig.ValidateIssuer,
+                    ValidIssuer = oidcConfig.ValidIssuer,
+                    ValidateAudience = oidcConfig.ValidateAudience,
+                    ValidateLifetime = oidcConfig.ValidateLifetime,
+                    ValidateIssuerSigningKey = oidcConfig.ValidateIssuerSigningKey,
+                    ClockSkew = TimeSpan.FromMinutes(5)
+                };
+
+                options.Events = CreateLoggingJwtBearerEvents();
+            });
+
+        return services;
+    }
+
+    private static JwtBearerEvents CreateLoggingJwtBearerEvents(JwtBearerEvents? existingEvents = null)
+    {
+        return new JwtBearerEvents
+        {
+            OnMessageReceived = existingEvents?.OnMessageReceived ?? (_ => Task.CompletedTask),
+            OnAuthenticationFailed = context =>
+            {
+                var logger = context.HttpContext.RequestServices.GetRequiredService<ILoggerFactory>()
+                    .CreateLogger("JwtBearerAuthentication");
+                logger.LogWarning(context.Exception, "JWT authentication failed: {Message}", context.Exception.Message);
+                return existingEvents?.OnAuthenticationFailed?.Invoke(context) ?? Task.CompletedTask;
+            },
+            OnChallenge = context =>
+            {
+                var logger = context.HttpContext.RequestServices.GetRequiredService<ILoggerFactory>()
+                    .CreateLogger("JwtBearerAuthentication");
+                logger.LogDebug(
+                    "JWT challenge issued. Error: {Error}, ErrorDescription: {ErrorDescription}",
+                    context.Error,
+                    context.ErrorDescription);
+                return existingEvents?.OnChallenge?.Invoke(context) ?? Task.CompletedTask;
+            },
+            OnTokenValidated = context =>
+            {
+                var logger = context.HttpContext.RequestServices.GetRequiredService<ILoggerFactory>()
+                    .CreateLogger("JwtBearerAuthentication");
+                logger.LogDebug(
+                    "JWT token validated for subject: {Subject}",
+                    context.Principal?.FindFirst("sub")?.Value ?? "unknown");
+                return existingEvents?.OnTokenValidated?.Invoke(context) ?? Task.CompletedTask;
+            }
+        };
+    }
+
+    public static IServiceCollection AddOidcCookieAuthentication(this IServiceCollection services, IConfiguration configuration, string cookieName = "fa_session")
+    {
+        var oidcConfig = configuration.GetSection("OIDC").Get<OidcConfiguration>();
+        if (oidcConfig == null)
+        {
+            throw new InvalidOperationException("OIDC configuration is required but not found in app settings");
+        }
+
+        ValidateOidcConfiguration(oidcConfig);
+
+        services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
+            .AddJwtBearer(options =>
+            {
+                options.Authority = oidcConfig.Authority;
+                options.Audience = oidcConfig.Audience;
+                options.RequireHttpsMetadata = !string.IsNullOrEmpty(oidcConfig.Authority) && oidcConfig.Authority.StartsWith("https://");
+
+                var cookieEvents = new JwtBearerEvents
+                {
+                    OnMessageReceived = context =>
+                    {
+                        // Try to get token from cookie first, then from Authorization header
+                        if (context.Request.Cookies.ContainsKey(cookieName))
+                        {
+                            context.Token = context.Request.Cookies[cookieName];
+                        }
+                        return Task.CompletedTask;
+                    }
+                };
+                options.Events = CreateLoggingJwtBearerEvents(cookieEvents);
+
+                options.TokenValidationParameters = new TokenValidationParameters
+                {
+                    ValidateIssuer = oidcConfig.ValidateIssuer,
+                    ValidIssuer = oidcConfig.ValidIssuer,
+                    ValidateAudience = oidcConfig.ValidateAudience,
+                    ValidateLifetime = oidcConfig.ValidateLifetime,
+                    ValidateIssuerSigningKey = oidcConfig.ValidateIssuerSigningKey,
+                    ClockSkew = TimeSpan.FromMinutes(5)
+                };
+            });
+
+        return services;
+    }
+
+    public static IServiceCollection AddFictionArchiveAuthorization(this IServiceCollection services)
+    {
+        services.AddAuthorizationBuilder()
+            .AddPolicy(AuthorizationConstants.Policies.Admin, policy => policy.RequireRole(AuthorizationConstants.Roles.Admin))
+            .AddPolicy(AuthorizationConstants.Policies.User, policy => policy.RequireAuthenticatedUser());
+
+        return services;
+    }
+
+    private static void ValidateOidcConfiguration(OidcConfiguration config)
+    {
+        var errors = new List<string>();
+
+        if (string.IsNullOrWhiteSpace(config.Authority))
+            errors.Add("OIDC Authority is required");
+
+        if (string.IsNullOrWhiteSpace(config.ClientId))
+            errors.Add("OIDC ClientId is required");
+
+        if (string.IsNullOrWhiteSpace(config.Audience))
+            errors.Add("OIDC Audience is required");
+
+        if (!Uri.TryCreate(config.Authority, UriKind.Absolute, out var authorityUri))
+            errors.Add($"OIDC Authority '{config.Authority}' is not a valid URI");
+        else if (!authorityUri.Scheme.Equals("https", StringComparison.OrdinalIgnoreCase) &&
+                 !authorityUri.Host.Equals("localhost", StringComparison.OrdinalIgnoreCase))
+            errors.Add("OIDC Authority must use HTTPS unless running on localhost");
+
+        if (errors.Any())
+        {
+            throw new InvalidOperationException($"OIDC configuration validation failed:{Environment.NewLine}{string.Join(Environment.NewLine, errors)}");
+        }
+    }
+}
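The cookie-first token selection in AddOidcCookieAuthentication is the crux of the FileService flow: browser image requests carry the fa_session cookie rather than an Authorization header. The effective selection rule can be sketched as a hedged Python port (illustrative only; the real behavior is the OnMessageReceived handler above plus JwtBearer's default header extraction when context.Token is left unset):

```python
def select_token(cookies, headers, cookie_name="fa_session"):
    # Prefer the session cookie (set by the frontend after OIDC login);
    # otherwise fall back to the standard "Authorization: Bearer <jwt>" header,
    # which is what the JwtBearer middleware reads when no token was supplied.
    if cookie_name in cookies:
        return cookies[cookie_name]
    auth = headers.get("Authorization", "")
    if auth.startswith("Bearer "):
        return auth[len("Bearer "):]
    return None
```

Note that when both are present the cookie wins, since the handler sets context.Token before header extraction would run.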

View File

@@ -6,16 +6,29 @@ namespace FictionArchive.Service.Shared.Extensions;
 public static class DatabaseExtensions
 {
-    public static IServiceCollection RegisterDbContext<TContext>(this IServiceCollection services,
-        string connectionString) where TContext : FictionArchiveDbContext
+    public static IServiceCollection RegisterDbContext<TContext>(
+        this IServiceCollection services,
+        string connectionString,
+        bool skipInfrastructure = false) where TContext : FictionArchiveDbContext
     {
-        services.AddDbContext<TContext>(options =>
-        {
-            options.UseNpgsql(connectionString, o =>
-            {
-                o.UseNodaTime();
-            });
-        });
+        if (skipInfrastructure)
+        {
+            // For schema export: use in-memory provider to allow EF Core entity discovery
+            services.AddDbContext<TContext>(options =>
+            {
+                options.UseInMemoryDatabase($"SchemaExport_{typeof(TContext).Name}");
+            });
+        }
+        else
+        {
+            services.AddDbContext<TContext>(options =>
+            {
+                options.UseNpgsql(connectionString, o =>
+                {
+                    o.UseNodaTime();
+                });
+            });
+        }
+
         return services;
     }
 }

View File

@@ -9,6 +9,7 @@
   <ItemGroup>
     <PackageReference Include="GraphQL.Server.Ui.GraphiQL" Version="8.3.3" />
     <PackageReference Include="HotChocolate.AspNetCore" Version="15.1.11" />
+    <PackageReference Include="HotChocolate.AspNetCore.Authorization" Version="15.1.11" />
     <PackageReference Include="HotChocolate.AspNetCore.CommandLine" Version="15.1.11" />
     <PackageReference Include="HotChocolate.Data.EntityFramework" Version="15.1.11" />
     <PackageReference Include="HotChocolate.Types.Scalars" Version="15.1.11" />
@@ -18,6 +19,7 @@
       <PrivateAssets>all</PrivateAssets>
       <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
     </PackageReference>
+    <PackageReference Include="Microsoft.EntityFrameworkCore.InMemory" Version="9.0.11" />
     <PackageReference Include="Microsoft.EntityFrameworkCore.Relational" Version="9.0.11" />
     <PackageReference Include="Microsoft.EntityFrameworkCore.Tools" Version="9.0.11">
       <PrivateAssets>all</PrivateAssets>
@@ -28,6 +30,7 @@
     <PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL" Version="9.0.4" />
     <PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL.NodaTime" Version="9.0.4" />
     <PackageReference Include="RabbitMQ.Client" Version="7.2.0" />
+    <PackageReference Include="Microsoft.AspNetCore.Authentication.JwtBearer" Version="8.0.11" />
   </ItemGroup>

   <ItemGroup>

View File

@@ -0,0 +1,13 @@
+namespace FictionArchive.Service.Shared.Models.Authentication;
+
+public class OidcConfiguration
+{
+    public string Authority { get; set; } = string.Empty;
+    public string ClientId { get; set; } = string.Empty;
+    public string Audience { get; set; } = string.Empty;
+    public string? ValidIssuer { get; set; }
+    public bool ValidateIssuer { get; set; } = true;
+    public bool ValidateAudience { get; set; } = true;
+    public bool ValidateLifetime { get; set; } = true;
+    public bool ValidateIssuerSigningKey { get; set; } = true;
+}
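This configuration object is checked by ValidateOidcConfiguration before authentication is wired up; its key URI rule (HTTPS required unless the authority host is localhost) can be mirrored in a short Python sketch. The function name here is illustrative and not part of the shared library:

```python
from urllib.parse import urlparse

def authority_errors(authority):
    # Mirrors the two URI checks in ValidateOidcConfiguration:
    # the Authority must be an absolute URI, and it must use HTTPS
    # unless it points at localhost (a development convenience).
    errors = []
    parsed = urlparse(authority)
    if not parsed.scheme or not parsed.hostname:
        errors.append(f"OIDC Authority '{authority}' is not a valid URI")
    elif parsed.scheme.lower() != "https" and parsed.hostname.lower() != "localhost":
        errors.append("OIDC Authority must use HTTPS unless running on localhost")
    return errors
```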

View File

@@ -0,0 +1,22 @@
+namespace FictionArchive.Service.Shared;
+
+/// <summary>
+/// Detects if the application is running in schema export mode (for HotChocolate CLI commands).
+/// In this mode, infrastructure like RabbitMQ and databases should not be initialized.
+/// </summary>
+public static class SchemaExportDetector
+{
+    /// <summary>
+    /// Checks if the current run is a schema export command.
+    /// </summary>
+    /// <param name="args">Command line arguments passed to Main()</param>
+    /// <returns>True if running schema export, false otherwise</returns>
+    public static bool IsSchemaExportMode(string[] args)
+    {
+        // HotChocolate CLI pattern: "schema export" after "--" delimiter
+        // Handles: dotnet run -- schema export --output schema.graphql
+        var normalizedArgs = args.SkipWhile(a => a == "--").ToArray();
+        return normalizedArgs.Length > 0 &&
+               normalizedArgs[0].Equals("schema", StringComparison.OrdinalIgnoreCase);
+    }
+}
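The detection rule above (skip any leading "--" delimiters, then test whether the first remaining token is "schema", case-insensitively) can be sanity-checked with a direct Python port; the C# is authoritative, this is only a mirror of its logic:

```python
def is_schema_export_mode(args):
    # Skip leading "--" delimiters, as the C# SkipWhile does.
    i = 0
    while i < len(args) and args[i] == "--":
        i += 1
    rest = args[i:]
    # First remaining token must be "schema" (case-insensitive).
    return len(rest) > 0 and rest[0].lower() == "schema"
```

So both `dotnet run -- schema export` and a bare `schema export` argument list are recognized, while ordinary runs are not.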

View File

@@ -5,11 +5,13 @@ using FictionArchive.Service.TranslationService.Models.Enums;
 using FictionArchive.Service.TranslationService.Services;
 using FictionArchive.Service.TranslationService.Services.Database;
 using FictionArchive.Service.TranslationService.Services.TranslationEngines;
+using HotChocolate.Authorization;

 namespace FictionArchive.Service.TranslationService.GraphQL;

 public class Mutation
 {
+    [Authorize]
     public async Task<TranslationResult> TranslateText(string text, Language from, Language to, string translationEngineKey, TranslationEngineService translationEngineService)
     {
         var result = await translationEngineService.Translate(from, to, text, translationEngineKey);

View File

@@ -2,12 +2,14 @@ using FictionArchive.Service.TranslationService.Models;
 using FictionArchive.Service.TranslationService.Models.Database;
 using FictionArchive.Service.TranslationService.Services.Database;
 using FictionArchive.Service.TranslationService.Services.TranslationEngines;
+using HotChocolate.Authorization;
 using Microsoft.EntityFrameworkCore;

 namespace FictionArchive.Service.TranslationService.GraphQL;

 public class Query
 {
+    [Authorize]
     [UseFiltering]
     [UseSorting]
     public IEnumerable<TranslationEngineDescriptor> GetTranslationEngines(IEnumerable<ITranslationEngine> engines)
@@ -15,6 +17,7 @@ public class Query
         return engines.Select(engine => engine.Descriptor);
     }

+    [Authorize]
     [UsePaging]
     [UseProjection]
     [UseFiltering]

View File

@@ -1,5 +1,6 @@
 using DeepL;
 using FictionArchive.Common.Extensions;
+using FictionArchive.Service.Shared;
 using FictionArchive.Service.Shared.Extensions;
 using FictionArchive.Service.Shared.Services.EventBus.Implementations;
 using FictionArchive.Service.Shared.Services.GraphQL;
@@ -18,6 +19,8 @@ public class Program
 {
     public static void Main(string[] args)
     {
+        var isSchemaExport = SchemaExportDetector.IsSchemaExportMode(args);
+
         var builder = WebApplication.CreateBuilder(args);

         builder.AddLocalAppsettings();
@@ -25,24 +28,30 @@ public class Program
         #region Event Bus
-        builder.Services.AddRabbitMQ(opt =>
-        {
-            builder.Configuration.GetSection("RabbitMQ").Bind(opt);
-        })
-        .Subscribe<TranslationRequestCreatedEvent, TranslationRequestCreatedEventHandler>();
+        if (!isSchemaExport)
+        {
+            builder.Services.AddRabbitMQ(opt =>
+            {
+                builder.Configuration.GetSection("RabbitMQ").Bind(opt);
+            })
+            .Subscribe<TranslationRequestCreatedEvent, TranslationRequestCreatedEventHandler>();
+        }
         #endregion

         #region Database
-        builder.Services.RegisterDbContext<TranslationServiceDbContext>(builder.Configuration.GetConnectionString("DefaultConnection"));
+        builder.Services.RegisterDbContext<TranslationServiceDbContext>(
+            builder.Configuration.GetConnectionString("DefaultConnection"),
+            skipInfrastructure: isSchemaExport);
         #endregion

         #region GraphQL
-        builder.Services.AddDefaultGraphQl<Query, Mutation>();
+        builder.Services.AddDefaultGraphQl<Query, Mutation>()
+            .AddAuthorization();
         #endregion
@@ -58,11 +67,16 @@ public class Program
         #endregion

+        // Authentication & Authorization
+        builder.Services.AddOidcAuthentication(builder.Configuration);
+        builder.Services.AddFictionArchiveAuthorization();
+
         var app = builder.Build();

-        // Update database
-        using (var scope = app.Services.CreateScope())
+        // Update database (skip in schema export mode)
+        if (!isSchemaExport)
         {
+            using var scope = app.Services.CreateScope();
             var dbContext = scope.ServiceProvider.GetRequiredService<TranslationServiceDbContext>();
             dbContext.UpdateDatabase();
         }
@@ -71,6 +85,9 @@ public class Program
         app.MapHealthChecks("/healthz");

+        app.UseAuthentication();
+        app.UseAuthorization();
+
         app.MapGraphQL();
         app.RunWithGraphQLCommands(args);

View File

@@ -15,5 +15,15 @@
     "ConnectionString": "amqp://localhost",
     "ClientIdentifier": "TranslationService"
   },
-  "AllowedHosts": "*"
+  "AllowedHosts": "*",
+  "OIDC": {
+    "Authority": "https://auth.orfl.xyz/application/o/fiction-archive/",
+    "ClientId": "fictionarchive-api",
+    "Audience": "fictionarchive-api",
+    "ValidIssuer": "https://auth.orfl.xyz/application/o/fiction-archive/",
+    "ValidateIssuer": true,
+    "ValidateAudience": true,
+    "ValidateLifetime": true,
+    "ValidateIssuerSigningKey": true
+  }
 }

View File

@@ -1,10 +1,13 @@
+using FictionArchive.Service.Shared.Constants;
 using FictionArchive.Service.UserService.Models.Database;
 using FictionArchive.Service.UserService.Services;
+using HotChocolate.Authorization;

 namespace FictionArchive.Service.UserService.GraphQL;

 public class Mutation
 {
+    [Authorize(Roles = [AuthorizationConstants.Roles.Admin])]
     public async Task<User> RegisterUser(string username, string email, string oAuthProviderId,
         string? inviterOAuthProviderId, UserManagementService userManagementService)
     {

View File

@@ -1,10 +1,12 @@
 using FictionArchive.Service.UserService.Models.Database;
 using FictionArchive.Service.UserService.Services;
+using HotChocolate.Authorization;

 namespace FictionArchive.Service.UserService.GraphQL;

 public class Query
 {
+    [Authorize]
     public async Task<IQueryable<User>> GetUsers(UserManagementService userManagementService)
     {
         return userManagementService.GetUsers();

View File

@@ -1,3 +1,5 @@
+using FictionArchive.Common.Extensions;
+using FictionArchive.Service.Shared;
 using FictionArchive.Service.Shared.Extensions;
 using FictionArchive.Service.Shared.Services.EventBus.Implementations;
 using FictionArchive.Service.UserService.GraphQL;
@@ -11,38 +13,55 @@ public class Program
 {
     public static void Main(string[] args)
     {
+        var isSchemaExport = SchemaExportDetector.IsSchemaExportMode(args);
+
         var builder = WebApplication.CreateBuilder(args);

+        builder.AddLocalAppsettings();
+
         #region Event Bus
-        builder.Services.AddRabbitMQ(opt =>
-        {
-            builder.Configuration.GetSection("RabbitMQ").Bind(opt);
-        })
-        .Subscribe<AuthUserAddedEvent, AuthUserAddedEventHandler>();
+        if (!isSchemaExport)
+        {
+            builder.Services.AddRabbitMQ(opt =>
+            {
+                builder.Configuration.GetSection("RabbitMQ").Bind(opt);
+            })
+            .Subscribe<AuthUserAddedEvent, AuthUserAddedEventHandler>();
+        }
         #endregion

         #region GraphQL
-        builder.Services.AddDefaultGraphQl<Query, Mutation>();
+        builder.Services.AddDefaultGraphQl<Query, Mutation>()
+            .AddAuthorization();
         #endregion

-        builder.Services.RegisterDbContext<UserServiceDbContext>(builder.Configuration.GetConnectionString("DefaultConnection"));
+        builder.Services.RegisterDbContext<UserServiceDbContext>(
+            builder.Configuration.GetConnectionString("DefaultConnection"),
+            skipInfrastructure: isSchemaExport);
         builder.Services.AddTransient<UserManagementService>();
         builder.Services.AddHealthChecks();

+        // Authentication & Authorization
+        builder.Services.AddOidcAuthentication(builder.Configuration);
+        builder.Services.AddFictionArchiveAuthorization();
+
         var app = builder.Build();

-        // Update database
-        using (var scope = app.Services.CreateScope())
+        // Update database (skip in schema export mode)
+        if (!isSchemaExport)
        {
+            using var scope = app.Services.CreateScope();
             var dbContext = scope.ServiceProvider.GetRequiredService<UserServiceDbContext>();
             dbContext.UpdateDatabase();
         }

+        app.UseAuthentication();
+        app.UseAuthorization();
+
         app.MapGraphQL();
         app.MapHealthChecks("/healthz");

View File

@@ -12,5 +12,15 @@
     "ConnectionString": "amqp://localhost",
     "ClientIdentifier": "UserService"
   },
-  "AllowedHosts": "*"
+  "AllowedHosts": "*",
+  "OIDC": {
+    "Authority": "https://auth.orfl.xyz/application/o/fiction-archive/",
+    "ClientId": "fictionarchive-api",
+    "Audience": "fictionarchive-api",
+    "ValidIssuer": "https://auth.orfl.xyz/application/o/fiction-archive/",
+    "ValidateIssuer": true,
+    "ValidateAudience": true,
+    "ValidateLifetime": true,
+    "ValidateIssuerSigningKey": true
+  }
 }

View File

@@ -34,15 +34,18 @@ services:
# Backend Services # Backend Services
# =========================================== # ===========================================
novel-service: novel-service:
build: image: git.orfl.xyz/conco/fictionarchive-novel-service:latest
context: .
dockerfile: FictionArchive.Service.NovelService/Dockerfile
environment: environment:
ConnectionStrings__DefaultConnection: Host=postgres;Database=FictionArchive_NovelService;Username=${POSTGRES_USER:-postgres};Password=${POSTGRES_PASSWORD:-postgres} ConnectionStrings__DefaultConnection: Host=postgres;Database=FictionArchive_NovelService;Username=${POSTGRES_USER:-postgres};Password=${POSTGRES_PASSWORD:-postgres}
ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
Novelpia__Username: ${NOVELPIA_USERNAME} Novelpia__Username: ${NOVELPIA_USERNAME}
Novelpia__Password: ${NOVELPIA_PASSWORD} Novelpia__Password: ${NOVELPIA_PASSWORD}
NovelUpdateService__PendingImageUrl: https://files.fictionarchive.orfl.xyz/api/pendingupload.png NovelUpdateService__PendingImageUrl: https://files.fictionarchive.orfl.xyz/api/pendingupload.png
healthcheck:
test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/healthz"]
interval: 30s
timeout: 10s
retries: 3
depends_on: depends_on:
postgres: postgres:
condition: service_healthy condition: service_healthy
@@ -51,13 +54,16 @@ services:
restart: unless-stopped restart: unless-stopped
translation-service: translation-service:
build: image: git.orfl.xyz/conco/fictionarchive-translation-service:latest
context: .
dockerfile: FictionArchive.Service.TranslationService/Dockerfile
environment: environment:
ConnectionStrings__DefaultConnection: Host=postgres;Database=FictionArchive_TranslationService;Username=${POSTGRES_USER:-postgres};Password=${POSTGRES_PASSWORD:-postgres} ConnectionStrings__DefaultConnection: Host=postgres;Database=FictionArchive_TranslationService;Username=${POSTGRES_USER:-postgres};Password=${POSTGRES_PASSWORD:-postgres}
ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
DeepL__ApiKey: ${DEEPL_API_KEY} DeepL__ApiKey: ${DEEPL_API_KEY}
healthcheck:
test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/healthz"]
interval: 30s
timeout: 10s
retries: 3
depends_on: depends_on:
postgres: postgres:
condition: service_healthy condition: service_healthy
@@ -66,12 +72,15 @@ services:
     restart: unless-stopped

   scheduler-service:
-    build:
-      context: .
-      dockerfile: FictionArchive.Service.SchedulerService/Dockerfile
+    image: git.orfl.xyz/conco/fictionarchive-scheduler-service:latest
     environment:
       ConnectionStrings__DefaultConnection: Host=postgres;Database=FictionArchive_SchedulerService;Username=${POSTGRES_USER:-postgres};Password=${POSTGRES_PASSWORD:-postgres}
       ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
+    healthcheck:
+      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/healthz"]
+      interval: 30s
+      timeout: 10s
+      retries: 3
     depends_on:
       postgres:
         condition: service_healthy
@@ -80,12 +89,15 @@ services:
     restart: unless-stopped

   user-service:
-    build:
-      context: .
-      dockerfile: FictionArchive.Service.UserService/Dockerfile
+    image: git.orfl.xyz/conco/fictionarchive-user-service:latest
     environment:
       ConnectionStrings__DefaultConnection: Host=postgres;Database=FictionArchive_UserService;Username=${POSTGRES_USER:-postgres};Password=${POSTGRES_PASSWORD:-postgres}
       ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
+    healthcheck:
+      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/healthz"]
+      interval: 30s
+      timeout: 10s
+      retries: 3
     depends_on:
       postgres:
         condition: service_healthy
@@ -94,20 +106,21 @@ services:
     restart: unless-stopped

   authentication-service:
-    build:
-      context: .
-      dockerfile: FictionArchive.Service.AuthenticationService/Dockerfile
+    image: git.orfl.xyz/conco/fictionarchive-authentication-service:latest
     environment:
       ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
+    healthcheck:
+      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/healthz"]
+      interval: 30s
+      timeout: 10s
+      retries: 3
     depends_on:
       rabbitmq:
         condition: service_healthy
     restart: unless-stopped

   file-service:
-    build:
-      context: .
-      dockerfile: FictionArchive.Service.FileService/Dockerfile
+    image: git.orfl.xyz/conco/fictionarchive-file-service:latest
     environment:
       ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
       S3__Endpoint: ${S3_ENDPOINT:-https://s3.orfl.xyz}
@@ -115,6 +128,14 @@ services:
       S3__AccessKey: ${S3_ACCESS_KEY}
       S3__SecretKey: ${S3_SECRET_KEY}
       Proxy__BaseUrl: https://files.orfl.xyz/api
+      OIDC__Authority: https://auth.orfl.xyz/application/o/fictionarchive/
+      OIDC__ClientId: fictionarchive-files
+      OIDC__Audience: fictionarchive-api
+    healthcheck:
+      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/healthz"]
+      interval: 30s
+      timeout: 10s
+      retries: 3
     labels:
       - "traefik.enable=true"
       - "traefik.http.routers.file-service.rule=Host(`files.orfl.xyz`)"
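The `OIDC__Authority`, `OIDC__ClientId`, and `OIDC__Audience` settings added above must all be present for token validation to work, and an authority should normally be an HTTPS URL. A dependency-free TypeScript sketch of a sanity check over such settings — illustrative only, not code from the repo (the services implement this validation in C#), with the localhost exception as an assumption:

```typescript
interface OidcOptions {
  authority?: string;
  clientId?: string;
  audience?: string;
}

// Collect configuration errors instead of throwing, so all problems
// can be reported at once.
function validateOidc(o: OidcOptions): string[] {
  const errors: string[] = [];
  if (!o.authority) errors.push('OIDC Authority is not configured');
  if (!o.clientId) errors.push('OIDC ClientId is not configured');
  if (!o.audience) errors.push('OIDC Audience is not configured');
  if (o.authority) {
    const url = new URL(o.authority);
    const isLocalhost =
      url.hostname === 'localhost' || url.hostname === '127.0.0.1';
    // Require HTTPS except for local development (assumed policy).
    if (url.protocol !== 'https:' && !isLocalhost) {
      errors.push('OIDC Authority must use HTTPS outside localhost development');
    }
  }
  return errors;
}
```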
@@ -130,11 +151,17 @@ services:
   # API Gateway
   # ===========================================
   api-gateway:
-    build:
-      context: .
-      dockerfile: FictionArchive.API/Dockerfile
+    image: git.orfl.xyz/conco/fictionarchive-api:latest
     environment:
       ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
+      OIDC__Authority: https://auth.orfl.xyz/application/o/fictionarchive/
+      OIDC__ClientId: fictionarchive-api
+      OIDC__Audience: fictionarchive-api
+    healthcheck:
+      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/healthz"]
+      interval: 30s
+      timeout: 10s
+      retries: 3
     labels:
       - "traefik.enable=true"
       - "traefik.http.routers.api-gateway.rule=Host(`api.fictionarchive.orfl.xyz`)"
@@ -154,15 +181,12 @@ services:
   # Frontend
   # ===========================================
   frontend:
-    build:
-      context: ./fictionarchive-web
-      dockerfile: Dockerfile
-      args:
-        VITE_GRAPHQL_URI: https://api.fictionarchive.orfl.xyz/graphql/
-        VITE_OIDC_AUTHORITY: ${OIDC_AUTHORITY:-https://auth.orfl.xyz/application/o/fiction-archive/}
-        VITE_OIDC_CLIENT_ID: ${OIDC_CLIENT_ID}
-        VITE_OIDC_REDIRECT_URI: https://fictionarchive.orfl.xyz/
-        VITE_OIDC_POST_LOGOUT_REDIRECT_URI: https://fictionarchive.orfl.xyz/
+    image: git.orfl.xyz/conco/fictionarchive-frontend:latest
+    healthcheck:
+      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost/"]
+      interval: 30s
+      timeout: 10s
+      retries: 3
     labels:
       - "traefik.enable=true"
       - "traefik.http.routers.frontend.rule=Host(`fictionarchive.orfl.xyz`)"
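The same five-line healthcheck block is repeated for every backend service above. If that repetition ever becomes a maintenance burden, Compose extension fields plus a YAML anchor can dedupe it; a sketch of the idea (not what the file currently does — the frontend's port-80 check would still stay separate):

```yaml
x-healthcheck: &default-healthcheck
  test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/healthz"]
  interval: 30s
  timeout: 10s
  retries: 3

services:
  translation-service:
    image: git.orfl.xyz/conco/fictionarchive-translation-service:latest
    healthcheck: *default-healthcheck
```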


@@ -0,0 +1,40 @@
# Dependencies
node_modules
# Build output
dist
# Environment files
.env
.env.local
.env.*.local
# IDE and editor
.vscode
.idea
*.swp
*.swo
# Git
.git
.gitignore
# Logs
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Test coverage
coverage
# Docker
Dockerfile
.dockerignore
docker-compose*
# Documentation
README.md
*.md
# TypeScript build info
*.tsbuildinfo


@@ -12,9 +12,6 @@ dist
 dist-ssr
 *.local

-# Generated GraphQL artifacts
-src/__generated__/
-
 # Editor directories and files
 .vscode/*
 !.vscode/extensions.json


@@ -25,8 +25,8 @@ VITE_CODEGEN_TOKEN=your_api_token
 ## Scripts

 - `npm run dev`: start Vite dev server.
-- `npm run build`: type-check + build (runs codegen first via `prebuild`).
-- `npm run codegen`: generate typed hooks from `src/**/*.graphql` into `src/__generated__/graphql.ts`.
+- `npm run build`: type-check + production build.
+- `npm run codegen`: generate typed hooks from `src/**/*.graphql` into `src/__generated__/graphql.ts`. **Run this manually after changing GraphQL operations or when the gateway schema changes.**

 ## Project notes
@@ -39,4 +39,4 @@ VITE_CODEGEN_TOKEN=your_api_token
 - Default schema URL: `CODEGEN_SCHEMA_URL` (falls back to `VITE_GRAPHQL_URI`, then `https://localhost:5001/graphql`).
 - Add `VITE_CODEGEN_TOKEN` (or `CODEGEN_TOKEN`) if your gateway requires a bearer token during introspection.
-- Generated outputs land in `src/__generated__/graphql.ts` (git-ignored). Run `npm run codegen` after schema/operation changes or rely on `npm run build` (runs `prebuild`).
+- Generated outputs land in `src/__generated__/graphql.ts` (committed to git). Run `npm run codegen` after schema/operation changes.
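The schema-URL fallback described in the notes amounts to a three-step nullish-coalescing chain; a minimal TypeScript sketch of that resolution order (the function name is illustrative, not code from the repo):

```typescript
// Resolve the introspection schema URL the way the README describes:
// CODEGEN_SCHEMA_URL first, then VITE_GRAPHQL_URI, then the localhost default.
function resolveSchemaUrl(env: Record<string, string | undefined>): string {
  return (
    env.CODEGEN_SCHEMA_URL ??
    env.VITE_GRAPHQL_URI ??
    'https://localhost:5001/graphql'
  );
}
```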


@@ -22,15 +22,17 @@ const config: CodegenConfig = {
       plugins: [
         'typescript',
         'typescript-operations',
-        'typescript-react-apollo',
+        'typed-document-node',
       ],
       config: {
-        withHooks: true,
-        avoidOptionals: true,
-        dedupeFragments: true,
+        avoidOptionals: {
+          field: true,
+          inputValue: false,
+        },
+        enumsAsConst: true,
         maybeValue: 'T | null',
         skipTypename: true,
-        apolloReactHooksImportFrom: '@apollo/client/react',
+        useTypeImports: true,
       },
     },
   },
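The `enumsAsConst` and `maybeValue: 'T | null'` options above shape the generated output; a dependency-free sketch of the pattern they produce (`SortEnumType` and `Maybe` mirror the generated file, while `sortLabel` is an illustrative consumer, not part of the repo):

```typescript
// enumsAsConst emits a const object plus a union of its literal values,
// rather than a TypeScript enum.
export const SortEnumType = {
  Asc: 'ASC',
  Desc: 'DESC',
} as const;
export type SortEnumType = typeof SortEnumType[keyof typeof SortEnumType]; // 'ASC' | 'DESC'

// maybeValue: 'T | null' models nullable fields as explicit null, not undefined.
export type Maybe<T> = T | null;

// Consumers can pass either the const member or the raw literal.
export function sortLabel(dir: SortEnumType): string {
  return dir === SortEnumType.Asc ? 'ascending' : 'descending';
}
```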


@@ -6,7 +6,7 @@ import tseslint from 'typescript-eslint'
 import { defineConfig, globalIgnores } from 'eslint/config'

 export default defineConfig([
-  globalIgnores(['dist']),
+  globalIgnores(['dist', 'src/__generated__']),
   {
     files: ['**/*.{ts,tsx}'],
     extends: [

File diff suppressed because it is too large.


@@ -6,7 +6,6 @@
   "scripts": {
     "dev": "vite",
     "build": "tsc -b && vite build",
-    "prebuild": "npm run codegen",
     "codegen": "graphql-codegen --config codegen.ts -r dotenv/config --use-system-ca",
     "lint": "eslint .",
     "preview": "vite preview"
@@ -17,18 +16,18 @@
     "class-variance-authority": "^0.7.1",
     "clsx": "^2.1.1",
     "graphql": "^16.12.0",
-    "react-router-dom": "^6.27.0",
     "oidc-client-ts": "^3.4.1",
     "react": "^19.2.0",
     "react-dom": "^19.2.0",
+    "react-router-dom": "^6.27.0",
     "tailwind-merge": "^2.5.4"
   },
   "devDependencies": {
+    "@eslint/js": "^9.39.1",
     "@graphql-codegen/cli": "^5.0.3",
+    "@graphql-codegen/typed-document-node": "^6.1.1",
     "@graphql-codegen/typescript": "^4.0.9",
     "@graphql-codegen/typescript-operations": "^4.0.9",
-    "@graphql-codegen/typescript-react-apollo": "^4.0.9",
-    "@eslint/js": "^9.39.1",
     "@types/node": "^24.10.1",
     "@types/react": "^19.2.5",
     "@types/react-dom": "^19.2.3",


@@ -0,0 +1,774 @@
import type { TypedDocumentNode as DocumentNode } from '@graphql-typed-document-node/core';
export type Maybe<T> = T | null;
export type InputMaybe<T> = T | null;
export type Exact<T extends { [key: string]: unknown }> = { [K in keyof T]: T[K] };
export type MakeOptional<T, K extends keyof T> = Omit<T, K> & { [SubKey in K]?: Maybe<T[SubKey]> };
export type MakeMaybe<T, K extends keyof T> = Omit<T, K> & { [SubKey in K]: Maybe<T[SubKey]> };
export type MakeEmpty<T extends { [key: string]: unknown }, K extends keyof T> = { [_ in K]?: never };
export type Incremental<T> = T | { [P in keyof T]?: P extends ' $fragmentName' | '__typename' ? T[P] : never };
/** All built-in and custom scalars, mapped to their actual values */
export type Scalars = {
ID: { input: string; output: string; }
String: { input: string; output: string; }
Boolean: { input: boolean; output: boolean; }
Int: { input: number; output: number; }
Float: { input: number; output: number; }
Instant: { input: any; output: any; }
UUID: { input: any; output: any; }
UnsignedInt: { input: any; output: any; }
};
export type Chapter = {
body: LocalizationKey;
createdTime: Scalars['Instant']['output'];
id: Scalars['UnsignedInt']['output'];
images: Array<Image>;
lastUpdatedTime: Scalars['Instant']['output'];
name: LocalizationKey;
order: Scalars['UnsignedInt']['output'];
revision: Scalars['UnsignedInt']['output'];
url: Maybe<Scalars['String']['output']>;
};
export type ChapterFilterInput = {
and?: InputMaybe<Array<ChapterFilterInput>>;
body?: InputMaybe<LocalizationKeyFilterInput>;
createdTime?: InputMaybe<InstantFilterInput>;
id?: InputMaybe<UnsignedIntOperationFilterInputType>;
images?: InputMaybe<ListFilterInputTypeOfImageFilterInput>;
lastUpdatedTime?: InputMaybe<InstantFilterInput>;
name?: InputMaybe<LocalizationKeyFilterInput>;
or?: InputMaybe<Array<ChapterFilterInput>>;
order?: InputMaybe<UnsignedIntOperationFilterInputType>;
revision?: InputMaybe<UnsignedIntOperationFilterInputType>;
url?: InputMaybe<StringOperationFilterInput>;
};
export type ChapterPullRequestedEvent = {
chapterNumber: Scalars['UnsignedInt']['output'];
novelId: Scalars['UnsignedInt']['output'];
};
export type ChapterSortInput = {
body?: InputMaybe<LocalizationKeySortInput>;
createdTime?: InputMaybe<SortEnumType>;
id?: InputMaybe<SortEnumType>;
lastUpdatedTime?: InputMaybe<SortEnumType>;
name?: InputMaybe<LocalizationKeySortInput>;
order?: InputMaybe<SortEnumType>;
revision?: InputMaybe<SortEnumType>;
url?: InputMaybe<SortEnumType>;
};
export type DeleteJobError = KeyNotFoundError;
export type DeleteJobInput = {
jobKey: Scalars['String']['input'];
};
export type DeleteJobPayload = {
boolean: Maybe<Scalars['Boolean']['output']>;
errors: Maybe<Array<DeleteJobError>>;
};
export type DuplicateNameError = Error & {
message: Scalars['String']['output'];
};
export type Error = {
message: Scalars['String']['output'];
};
export type FetchChapterContentsInput = {
chapterNumber: Scalars['UnsignedInt']['input'];
novelId: Scalars['UnsignedInt']['input'];
};
export type FetchChapterContentsPayload = {
chapterPullRequestedEvent: Maybe<ChapterPullRequestedEvent>;
};
export type FormatError = Error & {
message: Scalars['String']['output'];
};
export type Image = {
chapter: Maybe<Chapter>;
createdTime: Scalars['Instant']['output'];
id: Scalars['UUID']['output'];
lastUpdatedTime: Scalars['Instant']['output'];
newPath: Maybe<Scalars['String']['output']>;
originalPath: Scalars['String']['output'];
};
export type ImageFilterInput = {
and?: InputMaybe<Array<ImageFilterInput>>;
chapter?: InputMaybe<ChapterFilterInput>;
createdTime?: InputMaybe<InstantFilterInput>;
id?: InputMaybe<UuidOperationFilterInput>;
lastUpdatedTime?: InputMaybe<InstantFilterInput>;
newPath?: InputMaybe<StringOperationFilterInput>;
or?: InputMaybe<Array<ImageFilterInput>>;
originalPath?: InputMaybe<StringOperationFilterInput>;
};
export type ImageSortInput = {
chapter?: InputMaybe<ChapterSortInput>;
createdTime?: InputMaybe<SortEnumType>;
id?: InputMaybe<SortEnumType>;
lastUpdatedTime?: InputMaybe<SortEnumType>;
newPath?: InputMaybe<SortEnumType>;
originalPath?: InputMaybe<SortEnumType>;
};
export type ImportNovelInput = {
novelUrl: Scalars['String']['input'];
};
export type ImportNovelPayload = {
novelUpdateRequestedEvent: Maybe<NovelUpdateRequestedEvent>;
};
export type InstantFilterInput = {
and?: InputMaybe<Array<InstantFilterInput>>;
or?: InputMaybe<Array<InstantFilterInput>>;
};
export type JobKey = {
group: Scalars['String']['output'];
name: Scalars['String']['output'];
};
export type JobPersistenceError = Error & {
message: Scalars['String']['output'];
};
export type KeyNotFoundError = Error & {
message: Scalars['String']['output'];
};
export type KeyValuePairOfStringAndString = {
key: Scalars['String']['output'];
value: Scalars['String']['output'];
};
export const Language = {
Ch: 'CH',
En: 'EN',
Ja: 'JA',
Kr: 'KR'
} as const;
export type Language = typeof Language[keyof typeof Language];
export type LanguageOperationFilterInput = {
eq?: InputMaybe<Language>;
in?: InputMaybe<Array<Language>>;
neq?: InputMaybe<Language>;
nin?: InputMaybe<Array<Language>>;
};
export type ListFilterInputTypeOfChapterFilterInput = {
all?: InputMaybe<ChapterFilterInput>;
any?: InputMaybe<Scalars['Boolean']['input']>;
none?: InputMaybe<ChapterFilterInput>;
some?: InputMaybe<ChapterFilterInput>;
};
export type ListFilterInputTypeOfImageFilterInput = {
all?: InputMaybe<ImageFilterInput>;
any?: InputMaybe<Scalars['Boolean']['input']>;
none?: InputMaybe<ImageFilterInput>;
some?: InputMaybe<ImageFilterInput>;
};
export type ListFilterInputTypeOfLocalizationTextFilterInput = {
all?: InputMaybe<LocalizationTextFilterInput>;
any?: InputMaybe<Scalars['Boolean']['input']>;
none?: InputMaybe<LocalizationTextFilterInput>;
some?: InputMaybe<LocalizationTextFilterInput>;
};
export type ListFilterInputTypeOfNovelFilterInput = {
all?: InputMaybe<NovelFilterInput>;
any?: InputMaybe<Scalars['Boolean']['input']>;
none?: InputMaybe<NovelFilterInput>;
some?: InputMaybe<NovelFilterInput>;
};
export type ListFilterInputTypeOfNovelTagFilterInput = {
all?: InputMaybe<NovelTagFilterInput>;
any?: InputMaybe<Scalars['Boolean']['input']>;
none?: InputMaybe<NovelTagFilterInput>;
some?: InputMaybe<NovelTagFilterInput>;
};
export type LocalizationKey = {
createdTime: Scalars['Instant']['output'];
id: Scalars['UUID']['output'];
lastUpdatedTime: Scalars['Instant']['output'];
texts: Array<LocalizationText>;
};
export type LocalizationKeyFilterInput = {
and?: InputMaybe<Array<LocalizationKeyFilterInput>>;
createdTime?: InputMaybe<InstantFilterInput>;
id?: InputMaybe<UuidOperationFilterInput>;
lastUpdatedTime?: InputMaybe<InstantFilterInput>;
or?: InputMaybe<Array<LocalizationKeyFilterInput>>;
texts?: InputMaybe<ListFilterInputTypeOfLocalizationTextFilterInput>;
};
export type LocalizationKeySortInput = {
createdTime?: InputMaybe<SortEnumType>;
id?: InputMaybe<SortEnumType>;
lastUpdatedTime?: InputMaybe<SortEnumType>;
};
export type LocalizationText = {
createdTime: Scalars['Instant']['output'];
id: Scalars['UUID']['output'];
language: Language;
lastUpdatedTime: Scalars['Instant']['output'];
text: Scalars['String']['output'];
translationEngine: Maybe<TranslationEngine>;
};
export type LocalizationTextFilterInput = {
and?: InputMaybe<Array<LocalizationTextFilterInput>>;
createdTime?: InputMaybe<InstantFilterInput>;
id?: InputMaybe<UuidOperationFilterInput>;
language?: InputMaybe<LanguageOperationFilterInput>;
lastUpdatedTime?: InputMaybe<InstantFilterInput>;
or?: InputMaybe<Array<LocalizationTextFilterInput>>;
text?: InputMaybe<StringOperationFilterInput>;
translationEngine?: InputMaybe<TranslationEngineFilterInput>;
};
export type Mutation = {
deleteJob: DeleteJobPayload;
fetchChapterContents: FetchChapterContentsPayload;
importNovel: ImportNovelPayload;
registerUser: RegisterUserPayload;
runJob: RunJobPayload;
scheduleEventJob: ScheduleEventJobPayload;
translateText: TranslateTextPayload;
};
export type MutationDeleteJobArgs = {
input: DeleteJobInput;
};
export type MutationFetchChapterContentsArgs = {
input: FetchChapterContentsInput;
};
export type MutationImportNovelArgs = {
input: ImportNovelInput;
};
export type MutationRegisterUserArgs = {
input: RegisterUserInput;
};
export type MutationRunJobArgs = {
input: RunJobInput;
};
export type MutationScheduleEventJobArgs = {
input: ScheduleEventJobInput;
};
export type MutationTranslateTextArgs = {
input: TranslateTextInput;
};
export type Novel = {
author: Person;
chapters: Array<Chapter>;
coverImage: Maybe<Image>;
createdTime: Scalars['Instant']['output'];
description: LocalizationKey;
externalId: Scalars['String']['output'];
id: Scalars['UnsignedInt']['output'];
lastUpdatedTime: Scalars['Instant']['output'];
name: LocalizationKey;
rawLanguage: Language;
rawStatus: NovelStatus;
source: Source;
statusOverride: Maybe<NovelStatus>;
tags: Array<NovelTag>;
url: Scalars['String']['output'];
};
export type NovelFilterInput = {
and?: InputMaybe<Array<NovelFilterInput>>;
author?: InputMaybe<PersonFilterInput>;
chapters?: InputMaybe<ListFilterInputTypeOfChapterFilterInput>;
coverImage?: InputMaybe<ImageFilterInput>;
createdTime?: InputMaybe<InstantFilterInput>;
description?: InputMaybe<LocalizationKeyFilterInput>;
externalId?: InputMaybe<StringOperationFilterInput>;
id?: InputMaybe<UnsignedIntOperationFilterInputType>;
lastUpdatedTime?: InputMaybe<InstantFilterInput>;
name?: InputMaybe<LocalizationKeyFilterInput>;
or?: InputMaybe<Array<NovelFilterInput>>;
rawLanguage?: InputMaybe<LanguageOperationFilterInput>;
rawStatus?: InputMaybe<NovelStatusOperationFilterInput>;
source?: InputMaybe<SourceFilterInput>;
statusOverride?: InputMaybe<NullableOfNovelStatusOperationFilterInput>;
tags?: InputMaybe<ListFilterInputTypeOfNovelTagFilterInput>;
url?: InputMaybe<StringOperationFilterInput>;
};
export type NovelSortInput = {
author?: InputMaybe<PersonSortInput>;
coverImage?: InputMaybe<ImageSortInput>;
createdTime?: InputMaybe<SortEnumType>;
description?: InputMaybe<LocalizationKeySortInput>;
externalId?: InputMaybe<SortEnumType>;
id?: InputMaybe<SortEnumType>;
lastUpdatedTime?: InputMaybe<SortEnumType>;
name?: InputMaybe<LocalizationKeySortInput>;
rawLanguage?: InputMaybe<SortEnumType>;
rawStatus?: InputMaybe<SortEnumType>;
source?: InputMaybe<SourceSortInput>;
statusOverride?: InputMaybe<SortEnumType>;
url?: InputMaybe<SortEnumType>;
};
export const NovelStatus = {
Abandoned: 'ABANDONED',
Completed: 'COMPLETED',
Hiatus: 'HIATUS',
InProgress: 'IN_PROGRESS',
Unknown: 'UNKNOWN'
} as const;
export type NovelStatus = typeof NovelStatus[keyof typeof NovelStatus];
export type NovelStatusOperationFilterInput = {
eq?: InputMaybe<NovelStatus>;
in?: InputMaybe<Array<NovelStatus>>;
neq?: InputMaybe<NovelStatus>;
nin?: InputMaybe<Array<NovelStatus>>;
};
export type NovelTag = {
createdTime: Scalars['Instant']['output'];
displayName: LocalizationKey;
id: Scalars['UnsignedInt']['output'];
key: Scalars['String']['output'];
lastUpdatedTime: Scalars['Instant']['output'];
novels: Array<Novel>;
source: Maybe<Source>;
tagType: TagType;
};
export type NovelTagFilterInput = {
and?: InputMaybe<Array<NovelTagFilterInput>>;
createdTime?: InputMaybe<InstantFilterInput>;
displayName?: InputMaybe<LocalizationKeyFilterInput>;
id?: InputMaybe<UnsignedIntOperationFilterInputType>;
key?: InputMaybe<StringOperationFilterInput>;
lastUpdatedTime?: InputMaybe<InstantFilterInput>;
novels?: InputMaybe<ListFilterInputTypeOfNovelFilterInput>;
or?: InputMaybe<Array<NovelTagFilterInput>>;
source?: InputMaybe<SourceFilterInput>;
tagType?: InputMaybe<TagTypeOperationFilterInput>;
};
export type NovelUpdateRequestedEvent = {
novelUrl: Scalars['String']['output'];
};
/** A connection to a list of items. */
export type NovelsConnection = {
/** A list of edges. */
edges: Maybe<Array<NovelsEdge>>;
/** A flattened list of the nodes. */
nodes: Maybe<Array<Novel>>;
/** Information to aid in pagination. */
pageInfo: PageInfo;
};
/** An edge in a connection. */
export type NovelsEdge = {
/** A cursor for use in pagination. */
cursor: Scalars['String']['output'];
/** The item at the end of the edge. */
node: Novel;
};
export type NullableOfNovelStatusOperationFilterInput = {
eq?: InputMaybe<NovelStatus>;
in?: InputMaybe<Array<InputMaybe<NovelStatus>>>;
neq?: InputMaybe<NovelStatus>;
nin?: InputMaybe<Array<InputMaybe<NovelStatus>>>;
};
/** Information about pagination in a connection. */
export type PageInfo = {
/** When paginating forwards, the cursor to continue. */
endCursor: Maybe<Scalars['String']['output']>;
/** Indicates whether more edges exist following the set defined by the clients arguments. */
hasNextPage: Scalars['Boolean']['output'];
/** Indicates whether more edges exist prior the set defined by the clients arguments. */
hasPreviousPage: Scalars['Boolean']['output'];
/** When paginating backwards, the cursor to continue. */
startCursor: Maybe<Scalars['String']['output']>;
};
export type Person = {
createdTime: Scalars['Instant']['output'];
externalUrl: Maybe<Scalars['String']['output']>;
id: Scalars['UnsignedInt']['output'];
lastUpdatedTime: Scalars['Instant']['output'];
name: LocalizationKey;
};
export type PersonFilterInput = {
and?: InputMaybe<Array<PersonFilterInput>>;
createdTime?: InputMaybe<InstantFilterInput>;
externalUrl?: InputMaybe<StringOperationFilterInput>;
id?: InputMaybe<UnsignedIntOperationFilterInputType>;
lastUpdatedTime?: InputMaybe<InstantFilterInput>;
name?: InputMaybe<LocalizationKeyFilterInput>;
or?: InputMaybe<Array<PersonFilterInput>>;
};
export type PersonSortInput = {
createdTime?: InputMaybe<SortEnumType>;
externalUrl?: InputMaybe<SortEnumType>;
id?: InputMaybe<SortEnumType>;
lastUpdatedTime?: InputMaybe<SortEnumType>;
name?: InputMaybe<LocalizationKeySortInput>;
};
export type Query = {
jobs: Array<SchedulerJob>;
novels: Maybe<NovelsConnection>;
translationEngines: Array<TranslationEngineDescriptor>;
translationRequests: Maybe<TranslationRequestsConnection>;
users: Array<User>;
};
export type QueryNovelsArgs = {
after?: InputMaybe<Scalars['String']['input']>;
before?: InputMaybe<Scalars['String']['input']>;
first?: InputMaybe<Scalars['Int']['input']>;
last?: InputMaybe<Scalars['Int']['input']>;
order?: InputMaybe<Array<NovelSortInput>>;
where?: InputMaybe<NovelFilterInput>;
};
export type QueryTranslationEnginesArgs = {
order?: InputMaybe<Array<TranslationEngineDescriptorSortInput>>;
where?: InputMaybe<TranslationEngineDescriptorFilterInput>;
};
export type QueryTranslationRequestsArgs = {
after?: InputMaybe<Scalars['String']['input']>;
before?: InputMaybe<Scalars['String']['input']>;
first?: InputMaybe<Scalars['Int']['input']>;
last?: InputMaybe<Scalars['Int']['input']>;
order?: InputMaybe<Array<TranslationRequestSortInput>>;
where?: InputMaybe<TranslationRequestFilterInput>;
};
export type RegisterUserInput = {
email: Scalars['String']['input'];
inviterOAuthProviderId?: InputMaybe<Scalars['String']['input']>;
oAuthProviderId: Scalars['String']['input'];
username: Scalars['String']['input'];
};
export type RegisterUserPayload = {
user: Maybe<User>;
};
export type RunJobError = JobPersistenceError;
export type RunJobInput = {
jobKey: Scalars['String']['input'];
};
export type RunJobPayload = {
boolean: Maybe<Scalars['Boolean']['output']>;
errors: Maybe<Array<RunJobError>>;
};
export type ScheduleEventJobError = DuplicateNameError | FormatError;
export type ScheduleEventJobInput = {
cronSchedule: Scalars['String']['input'];
description: Scalars['String']['input'];
eventData: Scalars['String']['input'];
eventType: Scalars['String']['input'];
key: Scalars['String']['input'];
};
export type ScheduleEventJobPayload = {
errors: Maybe<Array<ScheduleEventJobError>>;
schedulerJob: Maybe<SchedulerJob>;
};
export type SchedulerJob = {
cronSchedule: Array<Scalars['String']['output']>;
description: Scalars['String']['output'];
jobData: Array<KeyValuePairOfStringAndString>;
jobKey: JobKey;
jobTypeName: Scalars['String']['output'];
};
export const SortEnumType = {
Asc: 'ASC',
Desc: 'DESC'
} as const;
export type SortEnumType = typeof SortEnumType[keyof typeof SortEnumType];
export type Source = {
createdTime: Scalars['Instant']['output'];
id: Scalars['UnsignedInt']['output'];
key: Scalars['String']['output'];
lastUpdatedTime: Scalars['Instant']['output'];
name: Scalars['String']['output'];
url: Scalars['String']['output'];
};
export type SourceFilterInput = {
and?: InputMaybe<Array<SourceFilterInput>>;
createdTime?: InputMaybe<InstantFilterInput>;
id?: InputMaybe<UnsignedIntOperationFilterInputType>;
key?: InputMaybe<StringOperationFilterInput>;
lastUpdatedTime?: InputMaybe<InstantFilterInput>;
name?: InputMaybe<StringOperationFilterInput>;
or?: InputMaybe<Array<SourceFilterInput>>;
url?: InputMaybe<StringOperationFilterInput>;
};
export type SourceSortInput = {
createdTime?: InputMaybe<SortEnumType>;
id?: InputMaybe<SortEnumType>;
key?: InputMaybe<SortEnumType>;
lastUpdatedTime?: InputMaybe<SortEnumType>;
name?: InputMaybe<SortEnumType>;
url?: InputMaybe<SortEnumType>;
};
export type StringOperationFilterInput = {
and?: InputMaybe<Array<StringOperationFilterInput>>;
contains?: InputMaybe<Scalars['String']['input']>;
endsWith?: InputMaybe<Scalars['String']['input']>;
eq?: InputMaybe<Scalars['String']['input']>;
in?: InputMaybe<Array<InputMaybe<Scalars['String']['input']>>>;
ncontains?: InputMaybe<Scalars['String']['input']>;
nendsWith?: InputMaybe<Scalars['String']['input']>;
neq?: InputMaybe<Scalars['String']['input']>;
nin?: InputMaybe<Array<InputMaybe<Scalars['String']['input']>>>;
nstartsWith?: InputMaybe<Scalars['String']['input']>;
or?: InputMaybe<Array<StringOperationFilterInput>>;
startsWith?: InputMaybe<Scalars['String']['input']>;
};
export const TagType = {
External: 'EXTERNAL',
Genre: 'GENRE',
System: 'SYSTEM',
UserDefined: 'USER_DEFINED'
} as const;
export type TagType = typeof TagType[keyof typeof TagType];
export type TagTypeOperationFilterInput = {
eq?: InputMaybe<TagType>;
in?: InputMaybe<Array<TagType>>;
neq?: InputMaybe<TagType>;
nin?: InputMaybe<Array<TagType>>;
};
export type TranslateTextInput = {
from: Language;
text: Scalars['String']['input'];
to: Language;
translationEngineKey: Scalars['String']['input'];
};
export type TranslateTextPayload = {
translationResult: Maybe<TranslationResult>;
};
export type TranslationEngine = {
createdTime: Scalars['Instant']['output'];
id: Scalars['UnsignedInt']['output'];
key: Scalars['String']['output'];
lastUpdatedTime: Scalars['Instant']['output'];
};
export type TranslationEngineDescriptor = {
displayName: Scalars['String']['output'];
key: Scalars['String']['output'];
};
export type TranslationEngineDescriptorFilterInput = {
and?: InputMaybe<Array<TranslationEngineDescriptorFilterInput>>;
displayName?: InputMaybe<StringOperationFilterInput>;
key?: InputMaybe<StringOperationFilterInput>;
or?: InputMaybe<Array<TranslationEngineDescriptorFilterInput>>;
};
export type TranslationEngineDescriptorSortInput = {
displayName?: InputMaybe<SortEnumType>;
key?: InputMaybe<SortEnumType>;
};
export type TranslationEngineFilterInput = {
and?: InputMaybe<Array<TranslationEngineFilterInput>>;
createdTime?: InputMaybe<InstantFilterInput>;
id?: InputMaybe<UnsignedIntOperationFilterInputType>;
key?: InputMaybe<StringOperationFilterInput>;
lastUpdatedTime?: InputMaybe<InstantFilterInput>;
or?: InputMaybe<Array<TranslationEngineFilterInput>>;
};
export type TranslationRequest = {
billedCharacterCount: Scalars['UnsignedInt']['output'];
createdTime: Scalars['Instant']['output'];
from: Language;
id: Scalars['UUID']['output'];
lastUpdatedTime: Scalars['Instant']['output'];
originalText: Scalars['String']['output'];
status: TranslationRequestStatus;
to: Language;
translatedText: Maybe<Scalars['String']['output']>;
translationEngineKey: Scalars['String']['output'];
};
export type TranslationRequestFilterInput = {
and?: InputMaybe<Array<TranslationRequestFilterInput>>;
billedCharacterCount?: InputMaybe<UnsignedIntOperationFilterInputType>;
createdTime?: InputMaybe<InstantFilterInput>;
from?: InputMaybe<LanguageOperationFilterInput>;
id?: InputMaybe<UuidOperationFilterInput>;
lastUpdatedTime?: InputMaybe<InstantFilterInput>;
or?: InputMaybe<Array<TranslationRequestFilterInput>>;
originalText?: InputMaybe<StringOperationFilterInput>;
status?: InputMaybe<TranslationRequestStatusOperationFilterInput>;
to?: InputMaybe<LanguageOperationFilterInput>;
translatedText?: InputMaybe<StringOperationFilterInput>;
translationEngineKey?: InputMaybe<StringOperationFilterInput>;
};
export type TranslationRequestSortInput = {
billedCharacterCount?: InputMaybe<SortEnumType>;
createdTime?: InputMaybe<SortEnumType>;
from?: InputMaybe<SortEnumType>;
id?: InputMaybe<SortEnumType>;
lastUpdatedTime?: InputMaybe<SortEnumType>;
originalText?: InputMaybe<SortEnumType>;
status?: InputMaybe<SortEnumType>;
to?: InputMaybe<SortEnumType>;
translatedText?: InputMaybe<SortEnumType>;
translationEngineKey?: InputMaybe<SortEnumType>;
};
export const TranslationRequestStatus = {
Failed: 'FAILED',
Pending: 'PENDING',
Success: 'SUCCESS'
} as const;
export type TranslationRequestStatus = typeof TranslationRequestStatus[keyof typeof TranslationRequestStatus];
export type TranslationRequestStatusOperationFilterInput = {
eq?: InputMaybe<TranslationRequestStatus>;
in?: InputMaybe<Array<TranslationRequestStatus>>;
neq?: InputMaybe<TranslationRequestStatus>;
nin?: InputMaybe<Array<TranslationRequestStatus>>;
};
/** A connection to a list of items. */
export type TranslationRequestsConnection = {
/** A list of edges. */
edges: Maybe<Array<TranslationRequestsEdge>>;
/** A flattened list of the nodes. */
nodes: Maybe<Array<TranslationRequest>>;
/** Information to aid in pagination. */
pageInfo: PageInfo;
};
/** An edge in a connection. */
export type TranslationRequestsEdge = {
/** A cursor for use in pagination. */
cursor: Scalars['String']['output'];
/** The item at the end of the edge. */
node: TranslationRequest;
};
export type TranslationResult = {
billedCharacterCount: Scalars['UnsignedInt']['output'];
from: Language;
originalText: Scalars['String']['output'];
status: TranslationRequestStatus;
to: Language;
translatedText: Maybe<Scalars['String']['output']>;
translationEngineKey: Scalars['String']['output'];
};
export type UnsignedIntOperationFilterInputType = {
eq?: InputMaybe<Scalars['UnsignedInt']['input']>;
gt?: InputMaybe<Scalars['UnsignedInt']['input']>;
gte?: InputMaybe<Scalars['UnsignedInt']['input']>;
in?: InputMaybe<Array<InputMaybe<Scalars['UnsignedInt']['input']>>>;
lt?: InputMaybe<Scalars['UnsignedInt']['input']>;
lte?: InputMaybe<Scalars['UnsignedInt']['input']>;
neq?: InputMaybe<Scalars['UnsignedInt']['input']>;
ngt?: InputMaybe<Scalars['UnsignedInt']['input']>;
ngte?: InputMaybe<Scalars['UnsignedInt']['input']>;
nin?: InputMaybe<Array<InputMaybe<Scalars['UnsignedInt']['input']>>>;
nlt?: InputMaybe<Scalars['UnsignedInt']['input']>;
nlte?: InputMaybe<Scalars['UnsignedInt']['input']>;
};
export type User = {
createdTime: Scalars['Instant']['output'];
disabled: Scalars['Boolean']['output'];
email: Scalars['String']['output'];
id: Scalars['UUID']['output'];
inviter: Maybe<User>;
lastUpdatedTime: Scalars['Instant']['output'];
oAuthProviderId: Scalars['String']['output'];
username: Scalars['String']['output'];
};
export type UuidOperationFilterInput = {
eq?: InputMaybe<Scalars['UUID']['input']>;
gt?: InputMaybe<Scalars['UUID']['input']>;
gte?: InputMaybe<Scalars['UUID']['input']>;
in?: InputMaybe<Array<InputMaybe<Scalars['UUID']['input']>>>;
lt?: InputMaybe<Scalars['UUID']['input']>;
lte?: InputMaybe<Scalars['UUID']['input']>;
neq?: InputMaybe<Scalars['UUID']['input']>;
ngt?: InputMaybe<Scalars['UUID']['input']>;
ngte?: InputMaybe<Scalars['UUID']['input']>;
nin?: InputMaybe<Array<InputMaybe<Scalars['UUID']['input']>>>;
nlt?: InputMaybe<Scalars['UUID']['input']>;
nlte?: InputMaybe<Scalars['UUID']['input']>;
};
export type NovelsQueryVariables = Exact<{
first?: InputMaybe<Scalars['Int']['input']>;
after?: InputMaybe<Scalars['String']['input']>;
}>;
export type NovelsQuery = { novels: { edges: Array<{ cursor: string, node: { id: any, url: string, name: { texts: Array<{ language: Language, text: string }> }, description: { texts: Array<{ language: Language, text: string }> }, coverImage: { originalPath: string, newPath: string | null } | null } }> | null, pageInfo: { hasNextPage: boolean, endCursor: string | null } } | null };
export const NovelsDocument = {"kind":"Document","definitions":[{"kind":"OperationDefinition","operation":"query","name":{"kind":"Name","value":"Novels"},"variableDefinitions":[{"kind":"VariableDefinition","variable":{"kind":"Variable","name":{"kind":"Name","value":"first"}},"type":{"kind":"NamedType","name":{"kind":"Name","value":"Int"}}},{"kind":"VariableDefinition","variable":{"kind":"Variable","name":{"kind":"Name","value":"after"}},"type":{"kind":"NamedType","name":{"kind":"Name","value":"String"}}}],"selectionSet":{"kind":"SelectionSet","selections":[{"kind":"Field","name":{"kind":"Name","value":"novels"},"arguments":[{"kind":"Argument","name":{"kind":"Name","value":"first"},"value":{"kind":"Variable","name":{"kind":"Name","value":"first"}}},{"kind":"Argument","name":{"kind":"Name","value":"after"},"value":{"kind":"Variable","name":{"kind":"Name","value":"after"}}}],"selectionSet":{"kind":"SelectionSet","selections":[{"kind":"Field","name":{"kind":"Name","value":"edges"},"selectionSet":{"kind":"SelectionSet","selections":[{"kind":"Field","name":{"kind":"Name","value":"cursor"}},{"kind":"Field","name":{"kind":"Name","value":"node"},"selectionSet":{"kind":"SelectionSet","selections":[{"kind":"Field","name":{"kind":"Name","value":"id"}},{"kind":"Field","name":{"kind":"Name","value":"url"}},{"kind":"Field","name":{"kind":"Name","value":"name"},"selectionSet":{"kind":"SelectionSet","selections":[{"kind":"Field","name":{"kind":"Name","value":"texts"},"selectionSet":{"kind":"SelectionSet","selections":[{"kind":"Field","name":{"kind":"Name","value":"language"}},{"kind":"Field","name":{"kind":"Name","value":"text"}}]}}]}},{"kind":"Field","name":{"kind":"Name","value":"description"},"selectionSet":{"kind":"SelectionSet","selections":[{"kind":"Field","name":{"kind":"Name","value":"texts"},"selectionSet":{"kind":"SelectionSet","selections":[{"kind":"Field","name":{"kind":"Name","value":"language"}},{"kind":"Field","name":{"kind":"Name","value":"text"}}]}}]}},{"kind":"Field","name":{"kind":"Name","value":"coverImage"},"selectionSet":{"kind":"SelectionSet","selections":[{"kind":"Field","name":{"kind":"Name","value":"originalPath"}},{"kind":"Field","name":{"kind":"Name","value":"newPath"}}]}}]}}]}},{"kind":"Field","name":{"kind":"Name","value":"pageInfo"},"selectionSet":{"kind":"SelectionSet","selections":[{"kind":"Field","name":{"kind":"Name","value":"hasNextPage"}},{"kind":"Field","name":{"kind":"Name","value":"endCursor"}}]}}]}}]}}]} as unknown as DocumentNode<NovelsQuery, NovelsQueryVariables>;


@@ -6,7 +6,7 @@ const uri = import.meta.env.VITE_GRAPHQL_URI ?? 'https://localhost:5001/graphql'
 const httpLink = new HttpLink({ uri })

-const authLink = new SetContextLink(async ({ headers }, _) => {
+const authLink = new SetContextLink(async ({ headers }) => {
   if (!userManager) return { headers }
   try {
     const user = await userManager.getUser()


@@ -2,6 +2,29 @@ import { createContext, useCallback, useContext, useEffect, useMemo, useRef, use
 import type { User } from 'oidc-client-ts'
 import { isOidcConfigured, userManager } from './oidcClient'
+
+// Cookie management helper functions
+function setCookieFromUser(user: User) {
+  if (!user?.access_token) return
+
+  const isProduction = window.location.hostname !== 'localhost'
+  const domain = isProduction ? '.orfl.xyz' : undefined
+  const secure = isProduction
+  const sameSite = isProduction ? 'None' : 'Lax'
+
+  // Set cookie with JWT token from user
+  const cookieValue = `fa_session=${user.access_token}; path=/; ${secure ? 'secure; ' : ''}samesite=${sameSite}${domain ? `; domain=${domain}` : ''}`
+  document.cookie = cookieValue
+}
+
+function clearFaSessionCookie() {
+  const isProduction = window.location.hostname !== 'localhost'
+  const domain = isProduction ? '.orfl.xyz' : undefined
+
+  // Clear cookie by setting expiration date in the past
+  const cookieValue = `fa_session=; path=/; expires=Thu, 01 Jan 1970 00:00:00 GMT${domain ? `; domain=${domain}` : ''}`
+  document.cookie = cookieValue
+}
+
 type AuthContextValue = {
   user: User | null
   isLoading: boolean
@@ -14,12 +37,11 @@ const AuthContext = createContext<AuthContextValue | undefined>(undefined)
 export function AuthProvider({ children }: { children: ReactNode }) {
   const [user, setUser] = useState<User | null>(null)
-  const [isLoading, setIsLoading] = useState(true)
+  const [isLoading, setIsLoading] = useState(!!userManager)
   const callbackHandledRef = useRef(false)

   useEffect(() => {
     if (!userManager) {
-      setIsLoading(false)
       return
     }
@@ -27,7 +49,12 @@ export function AuthProvider({ children }: { children: ReactNode }) {
     userManager
       .getUser()
       .then((loadedUser) => {
-        if (!cancelled) setUser(loadedUser ?? null)
+        if (!cancelled) {
+          setUser(loadedUser ?? null)
+          if (loadedUser) {
+            setCookieFromUser(loadedUser)
+          }
+        }
       })
       .finally(() => {
         if (!cancelled) setIsLoading(false)
@@ -42,8 +69,14 @@ export function AuthProvider({ children }: { children: ReactNode }) {
     const manager = userManager
     if (!manager) return

-    const handleLoaded = (nextUser: User) => setUser(nextUser)
-    const handleUnloaded = () => setUser(null)
+    const handleLoaded = (nextUser: User) => {
+      setUser(nextUser)
+      setCookieFromUser(nextUser)
+    }
+    const handleUnloaded = () => {
+      setUser(null)
+      clearFaSessionCookie()
+    }

     manager.events.addUserLoaded(handleLoaded)
     manager.events.addUserUnloaded(handleUnloaded)
@@ -73,6 +106,9 @@ export function AuthProvider({ children }: { children: ReactNode }) {
       .signinRedirectCallback()
       .then((nextUser) => {
         setUser(nextUser ?? null)
+        if (nextUser) {
+          setCookieFromUser(nextUser)
+        }
       })
       .catch((error) => {
         console.error('Failed to complete sign-in redirect', error)
@@ -104,6 +140,7 @@ export function AuthProvider({ children }: { children: ReactNode }) {
       console.error('Failed to sign out via redirect, clearing local session instead.', error)
       await manager.removeUser()
       setUser(null)
+      clearFaSessionCookie()
     }
   }, [])
@@ -121,6 +158,7 @@ export function AuthProvider({ children }: { children: ReactNode }) {
   return <AuthContext.Provider value={value}>{children}</AuthContext.Provider>
 }

+// eslint-disable-next-line react-refresh/only-export-components
 export function useAuth() {
   const context = useContext(AuthContext)
   if (!context) {
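As an aside, the cookie string assembled by `setCookieFromUser` in the hunk above can be checked in isolation. The sketch below mirrors that template-literal logic as a pure function; `buildFaSessionCookie` is a hypothetical name for illustration only, since the real code writes straight to `document.cookie`:

```typescript
// Mirrors the fa_session cookie string built in setCookieFromUser above.
// buildFaSessionCookie is a hypothetical helper, not part of the diff.
function buildFaSessionCookie(accessToken: string, hostname: string): string {
  const isProduction = hostname !== 'localhost'
  const domain = isProduction ? '.orfl.xyz' : undefined
  const secure = isProduction
  const sameSite = isProduction ? 'None' : 'Lax'
  return `fa_session=${accessToken}; path=/; ${secure ? 'secure; ' : ''}samesite=${sameSite}${domain ? `; domain=${domain}` : ''}`
}

// Development: no secure flag, SameSite=Lax, host-only cookie.
console.log(buildFaSessionCookie('token', 'localhost'))
// → fa_session=token; path=/; samesite=Lax

// Production: secure, SameSite=None, shared across .orfl.xyz subdomains.
console.log(buildFaSessionCookie('token', 'app.orfl.xyz'))
// → fa_session=token; path=/; secure; samesite=None; domain=.orfl.xyz
```

Note that SameSite=None requires the secure flag, which is why the two attributes flip together on the production branch.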


@@ -1,11 +1,13 @@
-import type { Novel } from '../__generated__/graphql'
+import type { NovelsQuery } from '../__generated__/graphql'
 import { Card, CardContent, CardHeader, CardTitle } from './ui/card'

+type NovelNode = NonNullable<NonNullable<NovelsQuery['novels']>['edges']>[number]['node']
+
 type NovelCardProps = {
-  novel: Novel
+  novel: NovelNode
 }

-function pickText(novelText?: Novel['name'] | Novel['description']) {
+function pickText(novelText?: NovelNode['name'] | NovelNode['description']) {
   const texts = novelText?.texts ?? []
   const english = texts.find((t) => t.language === 'EN')
   return (english ?? texts[0])?.text ?? 'No description available.'
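The fallback chain in `pickText` (English first, then the first available translation, then a placeholder) can be exercised on its own. This sketch substitutes a plain local type for the generated `NovelNode` shapes:

```typescript
// Minimal stand-in for the generated localized-text shape.
type LocalizedText = { language: string; text: string }

// Same fallback chain as pickText above: prefer EN, else first entry, else placeholder.
function pickText(texts: LocalizedText[] = []): string {
  const english = texts.find((t) => t.language === 'EN')
  return (english ?? texts[0])?.text ?? 'No description available.'
}

console.log(pickText([{ language: 'JP', text: '小説' }, { language: 'EN', text: 'Novel' }]))
// → Novel
console.log(pickText([{ language: 'JP', text: '小説' }]))
// → 小説
console.log(pickText())
// → No description available.
```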


@@ -31,4 +31,5 @@ function Badge({ className, variant, ...props }: BadgeProps) {
   )
 }

+// eslint-disable-next-line react-refresh/only-export-components
 export { Badge, badgeVariants }


@@ -51,4 +51,5 @@ const Button = React.forwardRef<HTMLButtonElement, ButtonProps>(
 )
 Button.displayName = 'Button'

+// eslint-disable-next-line react-refresh/only-export-components
 export { Button, buttonVariants }


@@ -2,8 +2,7 @@ import * as React from 'react'
 import { cn } from '../../lib/utils'

-export interface InputProps
-  extends React.InputHTMLAttributes<HTMLInputElement> {}
+export type InputProps = React.InputHTMLAttributes<HTMLInputElement>

 const Input = React.forwardRef<HTMLInputElement, InputProps>(
   ({ className, type, ...props }, ref) => {


@@ -1,6 +1,7 @@
 import { useMemo } from 'react'
-import { useNovelsQuery } from '../__generated__/graphql'
+import { useQuery } from '@apollo/client/react'
+import { NovelsDocument } from '../__generated__/graphql'
 import { NovelCard } from '../components/NovelCard'
 import { Button } from '../components/ui/button'
 import { Card, CardContent, CardHeader, CardTitle } from '../components/ui/card'
@@ -8,19 +9,18 @@ import { Card, CardContent, CardHeader, CardTitle } from '../components/ui/card'
 const PAGE_SIZE = 12

 export function NovelsPage() {
-  const { data, loading, error, fetchMore } = useNovelsQuery({
+  const { data, loading, error, fetchMore } = useQuery(NovelsDocument, {
     variables: { first: PAGE_SIZE, after: null },
     notifyOnNetworkStatusChange: true,
   })

-  const edges = data?.novels?.edges ?? []
   const pageInfo = data?.novels?.pageInfo
   const hasNextPage = pageInfo?.hasNextPage ?? false
   const endCursor = pageInfo?.endCursor ?? null

   const novels = useMemo(
-    () => edges.map((edge) => edge?.node).filter(Boolean),
-    [edges]
+    () => (data?.novels?.edges ?? []).map((edge) => edge?.node).filter(Boolean),
+    [data?.novels?.edges]
   )

   async function handleLoadMore() {
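The diff cuts off at `handleLoadMore`, which presumably forwards `endCursor` to `fetchMore`. Under relay-style cursor pagination, merging a fetched page into the existing connection would look roughly like this; the types and the `mergePages` helper are illustrative, not taken from the repo:

```typescript
// Illustrative relay-style connection types matching the NovelsQuery shape.
type Edge<T> = { cursor: string; node: T }
type PageInfo = { hasNextPage: boolean; endCursor: string | null }
type Connection<T> = { edges: Edge<T>[]; pageInfo: PageInfo }

// Append the next page's edges and adopt its pageInfo.
function mergePages<T>(prev: Connection<T>, next: Connection<T>): Connection<T> {
  return { edges: [...prev.edges, ...next.edges], pageInfo: next.pageInfo }
}

const page1: Connection<string> = {
  edges: [{ cursor: 'a', node: 'first' }],
  pageInfo: { hasNextPage: true, endCursor: 'a' },
}
const page2: Connection<string> = {
  edges: [{ cursor: 'b', node: 'second' }],
  pageInfo: { hasNextPage: false, endCursor: 'b' },
}

const merged = mergePages(page1, page2)
console.log(merged.edges.length, merged.pageInfo.endCursor)
// → 2 b
```

In Apollo this merge typically lives in the cache type policies or in `fetchMore`'s `updateQuery`, rather than in component code.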