Compare commits


33 Commits

Author SHA1 Message Date
gamer147
15e1a84f55 [FA-27] Update CICD
All checks were successful
CI / build-backend (pull_request) Successful in 1m6s
CI / build-frontend (pull_request) Successful in 41s
2026-01-19 17:03:44 -05:00
gamer147
70d4ba201a [FA-27] Fix unit test based on changes
All checks were successful
CI / build-backend (pull_request) Successful in 1m10s
CI / build-frontend (pull_request) Successful in 43s
2026-01-19 16:47:55 -05:00
gamer147
b69bcd6bf4 [FA-27] Fix user adding not using correct id
Some checks failed
CI / build-backend (pull_request) Failing after 1m2s
CI / build-frontend (pull_request) Successful in 41s
2026-01-19 16:14:49 -05:00
gamer147
c97654631b [FA-27] Still need to test events 2026-01-19 15:40:21 -05:00
gamer147
1ecfd9cc99 [FA-27] Need to test events but seems to mostly work 2026-01-19 15:13:14 -05:00
gamer147
19ae4a8089 Add .worktrees/ to .gitignore 2026-01-19 01:36:10 -05:00
gamer147
f8a45ad891 [FA-27] Bookmark implementation 2026-01-19 00:01:16 -05:00
gamer147
f67c5c610c Merge branch 'refs/heads/master' into feature/FA-27_Bookmarks 2025-12-30 11:07:36 -05:00
b5d4694f12 Merge pull request '[FA-misc] Update docker-compose.yml' (#58) from feature/FA-misc_AddDockerComposeUserService into master
All checks were successful
CI / build-backend (push) Successful in 1m7s
CI / build-frontend (push) Successful in 40s
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (push) Successful in 55s
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (push) Successful in 49s
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (push) Successful in 50s
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (push) Successful in 48s
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (push) Successful in 2m25s
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (push) Successful in 2m28s
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (push) Successful in 2m14s
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (push) Successful in 2m8s
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (push) Successful in 2m15s
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (push) Successful in 1m43s
Release / build-frontend (push) Successful in 1m43s
Build Gateway / build-gateway (push) Successful in 4m1s
Reviewed-on: #58
2025-12-30 03:26:06 +00:00
gamer147
6d47153a42 [FA-misc] Update docker-compose.yml
All checks were successful
CI / build-backend (pull_request) Successful in 1m26s
CI / build-frontend (pull_request) Successful in 50s
2025-12-29 22:23:29 -05:00
dbbc2fd8dc Merge pull request 'feature/FA-6_AuthorsPosts' (#57) from feature/FA-6_AuthorsPosts into master
All checks were successful
CI / build-backend (push) Successful in 1m16s
CI / build-frontend (push) Successful in 51s
Reviewed-on: #57
2025-12-30 03:14:53 +00:00
gamer147
176c94297b [FA-6] Author's posts seem to work
All checks were successful
CI / build-backend (pull_request) Successful in 2m4s
CI / build-frontend (pull_request) Successful in 46s
2025-12-29 22:06:12 -05:00
gamer147
8b3faa8f6c [FA-6] Good spot 2025-12-29 21:40:44 -05:00
gamer147
d87bd81190 [FA-6] Volumes work probably? 2025-12-29 21:28:07 -05:00
gamer147
bee805c441 [FA-6] Need to test Novelpia import 2025-12-29 20:27:04 -05:00
gamer147
5013da69c2 [FA-27] UserNovelDataService bootstrapped, going to do author's posts first i think 2025-12-29 14:54:01 -05:00
d8e3ec7ec9 Merge pull request 'feature/FA-55_UserServiceSetup' (#56) from feature/FA-55_UserServiceSetup into master
Some checks failed
CI / build-backend (push) Successful in 1m9s
CI / build-frontend (push) Failing after 44s
Reviewed-on: #56
2025-12-29 19:38:43 +00:00
gamer147
3612c89b99 [FA-55] Resolve linter error
All checks were successful
CI / build-backend (pull_request) Successful in 1m7s
CI / build-frontend (pull_request) Successful in 42s
2025-12-29 14:35:17 -05:00
gamer147
ebb2e6e7fc [FA-55] User service should be done
Some checks failed
CI / build-backend (pull_request) Successful in 2m2s
CI / build-frontend (pull_request) Failing after 30s
2025-12-29 14:33:08 -05:00
gamer147
01d3b94050 [FA-55] Finished aside from deactivation/integration events 2025-12-29 14:09:41 -05:00
gamer147
c0290cc5af [FA-55] User Service backend initial setup 2025-12-29 11:20:23 -05:00
1d950b7721 Merge pull request '[FA-misc] Whoops' (#53) from hotfix/FA-misc_LintFix into master
All checks were successful
CI / build-backend (push) Successful in 56s
CI / build-frontend (push) Successful in 39s
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (push) Successful in 49s
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (push) Successful in 49s
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (push) Successful in 49s
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (push) Successful in 46s
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (push) Successful in 3m54s
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (push) Successful in 2m10s
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (push) Successful in 1m56s
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (push) Successful in 1m58s
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (push) Successful in 1m58s
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (push) Successful in 3m28s
Release / build-frontend (push) Successful in 1m38s
Build Gateway / build-gateway (push) Successful in 4m7s
Reviewed-on: #53
2025-12-11 20:21:15 +00:00
gamer147
7738bcf438 [FA-misc] Whoops
Some checks failed
CI / build-frontend (pull_request) Has been cancelled
CI / build-backend (pull_request) Has been cancelled
2025-12-11 15:21:04 -05:00
61e0cb69d8 Merge pull request '[FA-misc] Add delete button' (#52) from hotfix/FA-misc_FixNovelRedownloads into master
Some checks failed
CI / build-backend (push) Successful in 1m3s
CI / build-frontend (push) Failing after 26s
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (push) Successful in 46s
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (push) Successful in 52s
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (push) Successful in 46s
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (push) Successful in 43s
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (push) Successful in 2m3s
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (push) Successful in 2m54s
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (push) Successful in 1m45s
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (push) Successful in 1m42s
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (push) Successful in 1m50s
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (push) Successful in 1m53s
Build Gateway / build-gateway (push) Has been cancelled
Release / build-frontend (push) Has been cancelled
Reviewed-on: #52
2025-12-11 20:01:43 +00:00
gamer147
02525d611a [FA-misc] Add delete button
Some checks failed
CI / build-backend (pull_request) Successful in 2m35s
CI / build-frontend (pull_request) Failing after 27s
2025-12-11 15:00:55 -05:00
c21fe0fbd5 Merge pull request '[FA-misc] Fix an oversight in the update process' (#51) from feature/FA-misc_NovelpiaResiliency into master
Some checks failed
CI / build-backend (push) Successful in 1m2s
CI / build-frontend (push) Successful in 41s
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (push) Successful in 1m1s
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (push) Successful in 48s
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (push) Successful in 45s
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (push) Successful in 42s
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (push) Successful in 2m12s
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (push) Successful in 2m3s
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (push) Successful in 1m48s
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (push) Successful in 1m44s
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (push) Successful in 1m58s
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (push) Successful in 1m55s
Release / build-frontend (push) Failing after 59s
Build Gateway / build-gateway (push) Successful in 3m38s
Reviewed-on: #51
2025-12-11 19:16:39 +00:00
gamer147
bbc0b5ec7d [FA-misc] Fix an oversight in the update process
All checks were successful
CI / build-backend (pull_request) Successful in 1m28s
CI / build-frontend (pull_request) Successful in 53s
2025-12-11 14:16:21 -05:00
5527c15ae7 Merge pull request '[FA-misc] Adds standard Polly Resiliency to Novelpia Http Clients' (#50) from feature/FA-misc_NovelpiaResiliency into master
All checks were successful
CI / build-backend (push) Successful in 1m0s
CI / build-frontend (push) Successful in 41s
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (push) Successful in 51s
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (push) Successful in 48s
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (push) Successful in 49s
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (push) Successful in 52s
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (push) Successful in 2m23s
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (push) Successful in 2m24s
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (push) Successful in 1m43s
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (push) Successful in 1m38s
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (push) Successful in 1m51s
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (push) Successful in 1m34s
Release / build-frontend (push) Successful in 1m33s
Build Gateway / build-gateway (push) Successful in 4m1s
Reviewed-on: #50
2025-12-11 14:54:12 +00:00
gamer147
1e374e6eeb [FA-misc] Adds standard Polly Resiliency to Novelpia Http Clients
All checks were successful
CI / build-backend (pull_request) Successful in 1m28s
CI / build-frontend (pull_request) Successful in 46s
2025-12-11 09:53:54 -05:00
c710f14257 Merge pull request '[FA-misc] Page title updates' (#49) from feature/FA-misc_UIUpdates into master
All checks were successful
CI / build-backend (push) Successful in 1m2s
CI / build-frontend (push) Successful in 40s
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (push) Successful in 48s
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (push) Successful in 45s
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (push) Successful in 47s
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (push) Successful in 43s
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (push) Successful in 2m17s
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (push) Successful in 2m10s
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (push) Successful in 1m56s
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (push) Successful in 1m52s
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (push) Successful in 1m54s
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (push) Successful in 1m47s
Release / build-frontend (push) Successful in 1m41s
Build Gateway / build-gateway (push) Successful in 3m41s
Reviewed-on: #49
2025-12-11 12:42:33 +00:00
gamer147
6c10077505 [FA-misc] Page title updates
All checks were successful
CI / build-backend (pull_request) Successful in 1m45s
CI / build-frontend (pull_request) Successful in 46s
2025-12-11 07:42:08 -05:00
fecb3e6f43 Merge pull request '[FA-misc] Various UI updates' (#48) from feature/FA-misc_UIUpdates into master
All checks were successful
CI / build-backend (push) Successful in 1m0s
CI / build-frontend (push) Successful in 37s
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (push) Successful in 46s
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (push) Successful in 43s
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (push) Successful in 46s
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (push) Successful in 44s
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (push) Successful in 2m11s
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (push) Successful in 2m2s
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (push) Successful in 1m48s
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (push) Successful in 1m57s
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (push) Successful in 1m50s
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (push) Successful in 1m40s
Release / build-frontend (push) Successful in 1m37s
Build Gateway / build-gateway (push) Successful in 3m56s
Reviewed-on: #48
2025-12-11 01:37:42 +00:00
gamer147
f0ea71e00e [FA-misc] Various UI updates
All checks were successful
CI / build-backend (pull_request) Successful in 1m38s
CI / build-frontend (pull_request) Successful in 37s
2025-12-10 20:37:30 -05:00
127 changed files with 15716 additions and 434 deletions


@@ -28,6 +28,9 @@ jobs:
- name: user-service
project: FictionArchive.Service.UserService
subgraph: User
+ - name: usernoveldata-service
+ project: FictionArchive.Service.UserNovelDataService
+ subgraph: UserNovelData
steps:
- name: Checkout
uses: actions/checkout@v4
@@ -110,6 +113,12 @@ jobs:
name: user-service-subgraph
path: subgraphs/user
+ - name: Download UserNovelData Service subgraph
+ uses: christopherhx/gitea-download-artifact@v4
+ with:
+ name: usernoveldata-service-subgraph
+ path: subgraphs/usernoveldata
- name: Configure subgraph URLs for Docker
run: |
for fsp in subgraphs/*/*.fsp; do


@@ -27,6 +27,8 @@ jobs:
dockerfile: FictionArchive.Service.SchedulerService/Dockerfile
- name: authentication-service
dockerfile: FictionArchive.Service.AuthenticationService/Dockerfile
+ - name: usernoveldata-service
+ dockerfile: FictionArchive.Service.UserNovelDataService/Dockerfile
steps:
- name: Checkout
uses: actions/checkout@v4

.gitignore (vendored): 3 lines changed

@@ -140,3 +140,6 @@ appsettings.Local.json
schema.graphql
*.fsp
gateway.fgp
+ # Git worktrees
+ .worktrees/

File diff suppressed because it is too large.


@@ -42,6 +42,13 @@ public class NovelUpdateServiceTests
Images = new List<Image>()
};
+ var volume = new Volume
+ {
+ Order = 1,
+ Name = LocalizationKey.CreateFromText("Main Story", Language.En),
+ Chapters = new List<Chapter> { chapter }
+ };
var novel = new Novel
{
Url = "http://demo/novel",
@@ -52,14 +59,14 @@ public class NovelUpdateServiceTests
Source = source,
Name = LocalizationKey.CreateFromText("Demo Novel", Language.En),
Description = LocalizationKey.CreateFromText("Description", Language.En),
- Chapters = new List<Chapter> { chapter },
+ Volumes = new List<Volume> { volume },
Tags = new List<NovelTag>()
};
dbContext.Novels.Add(novel);
dbContext.SaveChanges();
- return new NovelCreateResult(novel, chapter);
+ return new NovelCreateResult(novel, volume, chapter);
}
private static NovelUpdateService CreateService(
@@ -81,7 +88,7 @@ public class NovelUpdateServiceTests
{
using var dbContext = CreateDbContext();
var source = new Source { Name = "Demo", Key = "demo", Url = "http://demo" };
- var (novel, chapter) = CreateNovelWithSingleChapter(dbContext, source);
+ var (novel, volume, chapter) = CreateNovelWithSingleChapter(dbContext, source);
var rawHtml = "<p>Hello</p><img src=\"http://img/x1.jpg\" alt=\"first\" /><img src=\"http://img/x2.jpg\" alt=\"second\" />";
var image1 = new ImageData { Url = "http://img/x1.jpg", Data = new byte[] { 1, 2, 3 } };
@@ -103,7 +110,7 @@ public class NovelUpdateServiceTests
var pendingImageUrl = "https://pending/placeholder.jpg";
var service = CreateService(dbContext, adapter, eventBus, pendingImageUrl);
- var updatedChapter = await service.PullChapterContents(novel.Id, chapter.Order);
+ var updatedChapter = await service.PullChapterContents(novel.Id, volume.Id, chapter.Order);
updatedChapter.Images.Should().HaveCount(2);
updatedChapter.Images.Select(i => i.OriginalPath).Should().BeEquivalentTo(new[] { image1.Url, image2.Url });
@@ -131,7 +138,7 @@ public class NovelUpdateServiceTests
{
using var dbContext = CreateDbContext();
var source = new Source { Name = "Demo", Key = "demo", Url = "http://demo" };
- var (novel, chapter) = CreateNovelWithSingleChapter(dbContext, source);
+ var (novel, volume, chapter) = CreateNovelWithSingleChapter(dbContext, source);
var rawHtml = "<p>Hi</p><img src=\"http://img/x1.jpg\">";
var image = new ImageData { Url = "http://img/x1.jpg", Data = new byte[] { 7, 8, 9 } };
@@ -150,7 +157,7 @@ public class NovelUpdateServiceTests
var service = CreateService(dbContext, adapter, eventBus);
- var updatedChapter = await service.PullChapterContents(novel.Id, chapter.Order);
+ var updatedChapter = await service.PullChapterContents(novel.Id, volume.Id, chapter.Order);
var storedHtml = updatedChapter.Body.Texts.Single().Text;
var doc = new HtmlDocument();
@@ -161,7 +168,7 @@ public class NovelUpdateServiceTests
imgNode.GetAttributeValue("src", string.Empty).Should().Be("https://pending/placeholder.jpg");
}
- private record NovelCreateResult(Novel Novel, Chapter Chapter);
+ private record NovelCreateResult(Novel Novel, Volume Volume, Chapter Chapter);
#region UpdateImage Tests
@@ -199,7 +206,7 @@ public class NovelUpdateServiceTests
// Arrange
using var dbContext = CreateDbContext();
var source = new Source { Name = "Demo", Key = "demo", Url = "http://demo" };
- var (novel, chapter) = CreateNovelWithSingleChapter(dbContext, source);
+ var (novel, _, chapter) = CreateNovelWithSingleChapter(dbContext, source);
var image = new Image
{
@@ -252,7 +259,7 @@ public class NovelUpdateServiceTests
// Arrange
using var dbContext = CreateDbContext();
var source = new Source { Name = "Demo", Key = "demo", Url = "http://demo" };
- var (novel, chapter) = CreateNovelWithSingleChapter(dbContext, source);
+ var (_, _, chapter) = CreateNovelWithSingleChapter(dbContext, source);
var image1 = new Image { OriginalPath = "http://original/img1.jpg", Chapter = chapter };
var image2 = new Image { OriginalPath = "http://original/img2.jpg", Chapter = chapter };


@@ -7,6 +7,7 @@ using FictionArchive.Service.NovelService.Services;
using FictionArchive.Service.NovelService.Services.SourceAdapters;
using FictionArchive.Service.Shared.Services.EventBus;
using HotChocolate.Authorization;
using HotChocolate.Types;
using Microsoft.EntityFrameworkCore;
namespace FictionArchive.Service.NovelService.GraphQL;
@@ -20,10 +21,20 @@ public class Mutation
}
[Authorize]
- public async Task<ChapterPullRequestedEvent> FetchChapterContents(uint novelId,
- uint chapterNumber,
+ public async Task<ChapterPullRequestedEvent> FetchChapterContents(
+ uint novelId,
+ uint volumeId,
+ uint chapterOrder,
NovelUpdateService service)
{
- return await service.QueueChapterPull(novelId, chapterNumber);
+ return await service.QueueChapterPull(novelId, volumeId, chapterOrder);
}
+ [Error<KeyNotFoundException>]
+ [Authorize]
+ public async Task<bool> DeleteNovel(uint novelId, NovelUpdateService service)
+ {
+ await service.DeleteNovel(novelId);
+ return true;
+ }
}
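Assuming HotChocolate's default camelCase field naming (an assumption; the published schema may differ), the reworked mutations above could be invoked with an operation like this sketch:

```graphql
mutation {
  # Queue a content pull for chapter 3 of volume 2 in novel 1 (illustrative IDs)
  fetchChapterContents(novelId: 1, volumeId: 2, chapterOrder: 3) {
    __typename
  }
  # Delete a novel; the resolver returns true on success
  deleteNovel(novelId: 1)
}
```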


@@ -77,7 +77,19 @@ public class Query
}
: null,
- Chapters = novel.Chapters.Select(chapter => new ChapterDto
+ Volumes = novel.Volumes.OrderBy(v => v.Order).Select(volume => new VolumeDto
+ {
+ Id = volume.Id,
+ CreatedTime = volume.CreatedTime,
+ LastUpdatedTime = volume.LastUpdatedTime,
+ Order = volume.Order,
+ Name = volume.Name.Texts
+ .Where(t => t.Language == preferredLanguage)
+ .Select(t => t.Text)
+ .FirstOrDefault()
+ ?? volume.Name.Texts.Select(t => t.Text).FirstOrDefault()
+ ?? "",
+ Chapters = volume.Chapters.OrderBy(c => c.Order).Select(chapter => new ChapterDto
{
Id = chapter.Id,
CreatedTime = chapter.CreatedTime,
@@ -104,6 +116,7 @@ public class Query
LastUpdatedTime = image.LastUpdatedTime,
NewPath = image.NewPath
}).ToList()
+ }).ToList()
}).ToList(),
Tags = novel.Tags.Select(tag => new NovelTagDto
@@ -140,11 +153,12 @@ public class Query
public IQueryable<ChapterReaderDto> GetChapter(
NovelServiceDbContext dbContext,
uint novelId,
+ uint volumeOrder,
uint chapterOrder,
Language preferredLanguage = Language.En)
{
return dbContext.Chapters
- .Where(c => c.Novel.Id == novelId && c.Order == chapterOrder)
+ .Where(c => c.Volume.Novel.Id == novelId && c.Volume.Order == volumeOrder && c.Order == chapterOrder)
.Select(chapter => new ChapterReaderDto
{
Id = chapter.Id,
@@ -176,22 +190,72 @@ public class Query
NewPath = image.NewPath
}).ToList(),
- NovelId = chapter.Novel.Id,
- NovelName = chapter.Novel.Name.Texts
+ NovelId = chapter.Volume.Novel.Id,
+ NovelName = chapter.Volume.Novel.Name.Texts
.Where(t => t.Language == preferredLanguage)
.Select(t => t.Text)
.FirstOrDefault()
- ?? chapter.Novel.Name.Texts.Select(t => t.Text).FirstOrDefault()
+ ?? chapter.Volume.Novel.Name.Texts.Select(t => t.Text).FirstOrDefault()
?? "",
- TotalChapters = chapter.Novel.Chapters.Count,
- PrevChapterOrder = chapter.Novel.Chapters
// Volume context
VolumeId = chapter.Volume.Id,
VolumeName = chapter.Volume.Name.Texts
.Where(t => t.Language == preferredLanguage)
.Select(t => t.Text)
.FirstOrDefault()
?? chapter.Volume.Name.Texts.Select(t => t.Text).FirstOrDefault()
?? "",
VolumeOrder = chapter.Volume.Order,
TotalChaptersInVolume = chapter.Volume.Chapters.Count,
// Previous chapter: first try same volume, then last chapter of previous volume
PrevChapterVolumeOrder = chapter.Volume.Chapters
.Where(c => c.Order < chapterOrder)
.OrderByDescending(c => c.Order)
.Select(c => (int?)chapter.Volume.Order)
.FirstOrDefault()
?? chapter.Volume.Novel.Volumes
.Where(v => v.Order < chapter.Volume.Order)
.OrderByDescending(v => v.Order)
.SelectMany(v => v.Chapters.OrderByDescending(c => c.Order).Take(1))
.Select(c => (int?)c.Volume.Order)
.FirstOrDefault(),
PrevChapterOrder = chapter.Volume.Chapters
.Where(c => c.Order < chapterOrder)
.OrderByDescending(c => c.Order)
.Select(c => (uint?)c.Order)
.FirstOrDefault()
?? chapter.Volume.Novel.Volumes
.Where(v => v.Order < chapter.Volume.Order)
.OrderByDescending(v => v.Order)
.SelectMany(v => v.Chapters.OrderByDescending(c => c.Order).Take(1))
.Select(c => (uint?)c.Order)
.FirstOrDefault(),
- NextChapterOrder = chapter.Novel.Chapters
// Next chapter: first try same volume, then first chapter of next volume
NextChapterVolumeOrder = chapter.Volume.Chapters
.Where(c => c.Order > chapterOrder)
.OrderBy(c => c.Order)
.Select(c => (int?)chapter.Volume.Order)
.FirstOrDefault()
?? chapter.Volume.Novel.Volumes
.Where(v => v.Order > chapter.Volume.Order)
.OrderBy(v => v.Order)
.SelectMany(v => v.Chapters.OrderBy(c => c.Order).Take(1))
.Select(c => (int?)c.Volume.Order)
.FirstOrDefault(),
NextChapterOrder = chapter.Volume.Chapters
.Where(c => c.Order > chapterOrder)
.OrderBy(c => c.Order)
.Select(c => (uint?)c.Order)
.FirstOrDefault()
?? chapter.Volume.Novel.Volumes
.Where(v => v.Order > chapter.Volume.Order)
.OrderBy(v => v.Order)
.SelectMany(v => v.Chapters.OrderBy(c => c.Order).Take(1))
.Select(c => (uint?)c.Order)
.FirstOrDefault()
});
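The previous/next navigation above encodes one rule: look for an adjacent chapter within the current volume first, then fall back to the boundary chapter of the nearest neighbouring volume that has chapters. A minimal Python sketch of that rule (an illustrative in-memory model, not the EF Core query itself):

```python
def next_chapter(volumes, volume_order, chapter_order):
    """volumes maps a volume's order to its sorted list of chapter orders.

    Returns (volume_order, chapter_order) of the next chapter, or None
    when the given chapter is the last one in the novel.
    """
    # 1. Prefer the next chapter within the same volume.
    later = [c for c in volumes[volume_order] if c > chapter_order]
    if later:
        return (volume_order, min(later))
    # 2. Fall back to the first chapter of the nearest later, non-empty volume.
    for v in sorted(o for o in volumes if o > volume_order):
        if volumes[v]:
            return (v, volumes[v][0])
    return None


def prev_chapter(volumes, volume_order, chapter_order):
    """Mirror image: previous chapter in this volume, else the last
    chapter of the nearest earlier, non-empty volume."""
    earlier = [c for c in volumes[volume_order] if c < chapter_order]
    if earlier:
        return (volume_order, max(earlier))
    for v in sorted((o for o in volumes if o < volume_order), reverse=True):
        if volumes[v]:
            return (v, volumes[v][-1])
    return None
```

With `volumes = {1: [1, 2], 2: [1, 3]}`, `next_chapter(volumes, 1, 2)` crosses the volume boundary and yields `(2, 1)`, the same fallback the `SelectMany` chains express in the LINQ above.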


@@ -0,0 +1,605 @@
// <auto-generated />
using System;
using FictionArchive.Service.NovelService.Services;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Migrations;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
using NodaTime;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace FictionArchive.Service.NovelService.Migrations
{
[DbContext(typeof(NovelServiceDbContext))]
[Migration("20251229203027_AddVolumes")]
partial class AddVolumes
{
/// <inheritdoc />
protected override void BuildTargetModel(ModelBuilder modelBuilder)
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("ProductVersion", "9.0.11")
.HasAnnotation("Relational:MaxIdentifierLength", 63);
NpgsqlModelBuilderExtensions.UseIdentityByDefaultColumns(modelBuilder);
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Images.Image", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid");
b.Property<long?>("ChapterId")
.HasColumnType("bigint");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<string>("NewPath")
.HasColumnType("text");
b.Property<string>("OriginalPath")
.IsRequired()
.HasColumnType("text");
b.HasKey("Id");
b.HasIndex("ChapterId");
b.ToTable("Images");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Localization.LocalizationKey", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.HasKey("Id");
b.ToTable("LocalizationKeys");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Localization.LocalizationRequest", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<long>("EngineId")
.HasColumnType("bigint");
b.Property<Guid>("KeyRequestedForTranslationId")
.HasColumnType("uuid");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<int>("TranslateTo")
.HasColumnType("integer");
b.HasKey("Id");
b.HasIndex("EngineId");
b.HasIndex("KeyRequestedForTranslationId");
b.ToTable("LocalizationRequests");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Localization.LocalizationText", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<int>("Language")
.HasColumnType("integer");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Guid?>("LocalizationKeyId")
.HasColumnType("uuid");
b.Property<string>("Text")
.IsRequired()
.HasColumnType("text");
b.Property<long?>("TranslationEngineId")
.HasColumnType("bigint");
b.HasKey("Id");
b.HasIndex("LocalizationKeyId");
b.HasIndex("TranslationEngineId");
b.ToTable("LocalizationText");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Chapter", b =>
{
b.Property<long>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("bigint");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<long>("Id"));
b.Property<Guid>("BodyId")
.HasColumnType("uuid");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Guid>("NameId")
.HasColumnType("uuid");
b.Property<long>("Order")
.HasColumnType("bigint");
b.Property<long>("Revision")
.HasColumnType("bigint");
b.Property<string>("Url")
.HasColumnType("text");
b.Property<long>("VolumeId")
.HasColumnType("bigint");
b.HasKey("Id");
b.HasIndex("BodyId");
b.HasIndex("NameId");
b.HasIndex("VolumeId", "Order")
.IsUnique();
b.ToTable("Chapter");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Novel", b =>
{
b.Property<long>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("bigint");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<long>("Id"));
b.Property<long>("AuthorId")
.HasColumnType("bigint");
b.Property<Guid?>("CoverImageId")
.HasColumnType("uuid");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Guid>("DescriptionId")
.HasColumnType("uuid");
b.Property<string>("ExternalId")
.IsRequired()
.HasColumnType("text");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Guid>("NameId")
.HasColumnType("uuid");
b.Property<int>("RawLanguage")
.HasColumnType("integer");
b.Property<int>("RawStatus")
.HasColumnType("integer");
b.Property<long>("SourceId")
.HasColumnType("bigint");
b.Property<int?>("StatusOverride")
.HasColumnType("integer");
b.Property<string>("Url")
.IsRequired()
.HasColumnType("text");
b.HasKey("Id");
b.HasIndex("AuthorId");
b.HasIndex("CoverImageId");
b.HasIndex("DescriptionId");
b.HasIndex("NameId");
b.HasIndex("SourceId");
b.HasIndex("ExternalId", "SourceId")
.IsUnique();
b.ToTable("Novels");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.NovelTag", b =>
{
b.Property<long>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("bigint");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<long>("Id"));
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Guid>("DisplayNameId")
.HasColumnType("uuid");
b.Property<string>("Key")
.IsRequired()
.HasColumnType("text");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<long?>("SourceId")
.HasColumnType("bigint");
b.Property<int>("TagType")
.HasColumnType("integer");
b.HasKey("Id");
b.HasIndex("DisplayNameId");
b.HasIndex("SourceId");
b.ToTable("Tags");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Person", b =>
{
b.Property<long>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("bigint");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<long>("Id"));
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<string>("ExternalUrl")
.HasColumnType("text");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Guid>("NameId")
.HasColumnType("uuid");
b.HasKey("Id");
b.HasIndex("NameId");
b.ToTable("Person");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Source", b =>
{
b.Property<long>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("bigint");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<long>("Id"));
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<string>("Key")
.IsRequired()
.HasColumnType("text");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<string>("Name")
.IsRequired()
.HasColumnType("text");
b.Property<string>("Url")
.IsRequired()
.HasColumnType("text");
b.HasKey("Id");
b.ToTable("Sources");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.TranslationEngine", b =>
{
b.Property<long>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("bigint");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<long>("Id"));
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<string>("Key")
.IsRequired()
.HasColumnType("text");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.HasKey("Id");
b.ToTable("TranslationEngines");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Volume", b =>
{
b.Property<long>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("bigint");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<long>("Id"));
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Guid>("NameId")
.HasColumnType("uuid");
b.Property<long>("NovelId")
.HasColumnType("bigint");
b.Property<int>("Order")
.HasColumnType("integer");
b.HasKey("Id");
b.HasIndex("NameId");
b.HasIndex("NovelId", "Order")
.IsUnique();
b.ToTable("Volume");
});
modelBuilder.Entity("NovelNovelTag", b =>
{
b.Property<long>("NovelsId")
.HasColumnType("bigint");
b.Property<long>("TagsId")
.HasColumnType("bigint");
b.HasKey("NovelsId", "TagsId");
b.HasIndex("TagsId");
b.ToTable("NovelNovelTag");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Images.Image", b =>
{
b.HasOne("FictionArchive.Service.NovelService.Models.Novels.Chapter", "Chapter")
.WithMany("Images")
.HasForeignKey("ChapterId");
b.Navigation("Chapter");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Localization.LocalizationRequest", b =>
{
b.HasOne("FictionArchive.Service.NovelService.Models.Novels.TranslationEngine", "Engine")
.WithMany()
.HasForeignKey("EngineId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("FictionArchive.Service.NovelService.Models.Localization.LocalizationKey", "KeyRequestedForTranslation")
.WithMany()
.HasForeignKey("KeyRequestedForTranslationId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.Navigation("Engine");
b.Navigation("KeyRequestedForTranslation");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Localization.LocalizationText", b =>
{
b.HasOne("FictionArchive.Service.NovelService.Models.Localization.LocalizationKey", null)
.WithMany("Texts")
.HasForeignKey("LocalizationKeyId");
b.HasOne("FictionArchive.Service.NovelService.Models.Novels.TranslationEngine", "TranslationEngine")
.WithMany()
.HasForeignKey("TranslationEngineId");
b.Navigation("TranslationEngine");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Chapter", b =>
{
b.HasOne("FictionArchive.Service.NovelService.Models.Localization.LocalizationKey", "Body")
.WithMany()
.HasForeignKey("BodyId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("FictionArchive.Service.NovelService.Models.Localization.LocalizationKey", "Name")
.WithMany()
.HasForeignKey("NameId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("FictionArchive.Service.NovelService.Models.Novels.Volume", "Volume")
.WithMany("Chapters")
.HasForeignKey("VolumeId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.Navigation("Body");
b.Navigation("Name");
b.Navigation("Volume");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Novel", b =>
{
b.HasOne("FictionArchive.Service.NovelService.Models.Novels.Person", "Author")
.WithMany()
.HasForeignKey("AuthorId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("FictionArchive.Service.NovelService.Models.Images.Image", "CoverImage")
.WithMany()
.HasForeignKey("CoverImageId");
b.HasOne("FictionArchive.Service.NovelService.Models.Localization.LocalizationKey", "Description")
.WithMany()
.HasForeignKey("DescriptionId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("FictionArchive.Service.NovelService.Models.Localization.LocalizationKey", "Name")
.WithMany()
.HasForeignKey("NameId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("FictionArchive.Service.NovelService.Models.Novels.Source", "Source")
.WithMany()
.HasForeignKey("SourceId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.Navigation("Author");
b.Navigation("CoverImage");
b.Navigation("Description");
b.Navigation("Name");
b.Navigation("Source");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.NovelTag", b =>
{
b.HasOne("FictionArchive.Service.NovelService.Models.Localization.LocalizationKey", "DisplayName")
.WithMany()
.HasForeignKey("DisplayNameId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("FictionArchive.Service.NovelService.Models.Novels.Source", "Source")
.WithMany()
.HasForeignKey("SourceId");
b.Navigation("DisplayName");
b.Navigation("Source");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Person", b =>
{
b.HasOne("FictionArchive.Service.NovelService.Models.Localization.LocalizationKey", "Name")
.WithMany()
.HasForeignKey("NameId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.Navigation("Name");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Volume", b =>
{
b.HasOne("FictionArchive.Service.NovelService.Models.Localization.LocalizationKey", "Name")
.WithMany()
.HasForeignKey("NameId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("FictionArchive.Service.NovelService.Models.Novels.Novel", "Novel")
.WithMany("Volumes")
.HasForeignKey("NovelId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.Navigation("Name");
b.Navigation("Novel");
});
modelBuilder.Entity("NovelNovelTag", b =>
{
b.HasOne("FictionArchive.Service.NovelService.Models.Novels.Novel", null)
.WithMany()
.HasForeignKey("NovelsId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("FictionArchive.Service.NovelService.Models.Novels.NovelTag", null)
.WithMany()
.HasForeignKey("TagsId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Localization.LocalizationKey", b =>
{
b.Navigation("Texts");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Chapter", b =>
{
b.Navigation("Images");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Novel", b =>
{
b.Navigation("Volumes");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Volume", b =>
{
b.Navigation("Chapters");
});
#pragma warning restore 612, 618
}
}
}

View File

@@ -0,0 +1,195 @@
using System;
using Microsoft.EntityFrameworkCore.Migrations;
using NodaTime;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace FictionArchive.Service.NovelService.Migrations
{
/// <inheritdoc />
public partial class AddVolumes : Migration
{
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
// 1. Create the Volume table
migrationBuilder.CreateTable(
name: "Volume",
columns: table => new
{
Id = table.Column<long>(type: "bigint", nullable: false)
.Annotation("Npgsql:ValueGenerationStrategy", NpgsqlValueGenerationStrategy.IdentityByDefaultColumn),
Order = table.Column<int>(type: "integer", nullable: false),
NameId = table.Column<Guid>(type: "uuid", nullable: false),
NovelId = table.Column<long>(type: "bigint", nullable: false),
CreatedTime = table.Column<Instant>(type: "timestamp with time zone", nullable: false),
LastUpdatedTime = table.Column<Instant>(type: "timestamp with time zone", nullable: false)
},
constraints: table =>
{
table.PrimaryKey("PK_Volume", x => x.Id);
table.ForeignKey(
name: "FK_Volume_LocalizationKeys_NameId",
column: x => x.NameId,
principalTable: "LocalizationKeys",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
table.ForeignKey(
name: "FK_Volume_Novels_NovelId",
column: x => x.NovelId,
principalTable: "Novels",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
});
migrationBuilder.CreateIndex(
name: "IX_Volume_NameId",
table: "Volume",
column: "NameId");
migrationBuilder.CreateIndex(
name: "IX_Volume_NovelId_Order",
table: "Volume",
columns: new[] { "NovelId", "Order" },
unique: true);
// 2. Add nullable VolumeId column to Chapter (keep NovelId for now)
migrationBuilder.AddColumn<long>(
name: "VolumeId",
table: "Chapter",
type: "bigint",
nullable: true);
// 3. Data migration: Create volumes and link chapters for each novel
migrationBuilder.Sql(@"
DO $$
DECLARE
novel_rec RECORD;
loc_key_id uuid;
volume_id bigint;
BEGIN
FOR novel_rec IN SELECT ""Id"", ""RawLanguage"" FROM ""Novels"" LOOP
-- Create LocalizationKey for volume name
loc_key_id := gen_random_uuid();
INSERT INTO ""LocalizationKeys"" (""Id"", ""CreatedTime"", ""LastUpdatedTime"")
VALUES (loc_key_id, NOW(), NOW());
-- Create LocalizationText for 'Main Story' in novel's raw language
INSERT INTO ""LocalizationText"" (""Id"", ""LocalizationKeyId"", ""Language"", ""Text"", ""CreatedTime"", ""LastUpdatedTime"")
VALUES (gen_random_uuid(), loc_key_id, novel_rec.""RawLanguage"", 'Main Story', NOW(), NOW());
-- Create Volume for this novel
INSERT INTO ""Volume"" (""Order"", ""NameId"", ""NovelId"", ""CreatedTime"", ""LastUpdatedTime"")
VALUES (1, loc_key_id, novel_rec.""Id"", NOW(), NOW())
RETURNING ""Id"" INTO volume_id;
-- Link all chapters of this novel to the new volume
UPDATE ""Chapter"" SET ""VolumeId"" = volume_id WHERE ""NovelId"" = novel_rec.""Id"";
END LOOP;
END $$;
");
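The DO block above backfills one default volume per novel and repoints every chapter at it. A minimal Python sketch of the same logic, using hypothetical in-memory rows rather than the real schema:

```python
import itertools
import uuid

def backfill_volumes(novels, chapters):
    # One "Main Story" volume per novel at Order 1, each with a fresh
    # localization key id (gen_random_uuid() in the SQL above).
    volume_ids = itertools.count(1)
    volumes = {}
    for novel in novels:
        volumes[novel["id"]] = {
            "id": next(volume_ids),
            "order": 1,
            "novel_id": novel["id"],
            "name_key": uuid.uuid4(),
        }
    # Mirror of the UPDATE: link every chapter to its novel's new volume.
    for chapter in chapters:
        chapter["volume_id"] = volumes[chapter["novel_id"]]["id"]
    return list(volumes.values())

novels = [{"id": 10}, {"id": 20}]
chapters = [{"novel_id": 10}, {"novel_id": 10}, {"novel_id": 20}]
vols = backfill_volumes(novels, chapters)
```

Every chapter of a novel lands in that novel's single new volume, which is what makes the later NOT NULL constraint on VolumeId safe to apply.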
// 4. Drop old FK and index for NovelId
migrationBuilder.DropForeignKey(
name: "FK_Chapter_Novels_NovelId",
table: "Chapter");
migrationBuilder.DropIndex(
name: "IX_Chapter_NovelId",
table: "Chapter");
// 5. Drop NovelId column from Chapter
migrationBuilder.DropColumn(
name: "NovelId",
table: "Chapter");
// 6. Make VolumeId non-nullable
migrationBuilder.AlterColumn<long>(
name: "VolumeId",
table: "Chapter",
type: "bigint",
nullable: false,
oldClrType: typeof(long),
oldType: "bigint",
oldNullable: true);
// 7. Add unique index and FK for VolumeId
migrationBuilder.CreateIndex(
name: "IX_Chapter_VolumeId_Order",
table: "Chapter",
columns: new[] { "VolumeId", "Order" },
unique: true);
migrationBuilder.AddForeignKey(
name: "FK_Chapter_Volume_VolumeId",
table: "Chapter",
column: "VolumeId",
principalTable: "Volume",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
}
/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
// Add back NovelId column
migrationBuilder.AddColumn<long>(
name: "NovelId",
table: "Chapter",
type: "bigint",
nullable: true);
// Migrate data back: set NovelId from Volume
migrationBuilder.Sql(@"
UPDATE ""Chapter"" c
SET ""NovelId"" = v.""NovelId""
FROM ""Volume"" v
WHERE c.""VolumeId"" = v.""Id"";
");
// Make NovelId non-nullable
migrationBuilder.AlterColumn<long>(
name: "NovelId",
table: "Chapter",
type: "bigint",
nullable: false,
oldClrType: typeof(long),
oldType: "bigint",
oldNullable: true);
// Drop VolumeId FK and index
migrationBuilder.DropForeignKey(
name: "FK_Chapter_Volume_VolumeId",
table: "Chapter");
migrationBuilder.DropIndex(
name: "IX_Chapter_VolumeId_Order",
table: "Chapter");
// Drop VolumeId column
migrationBuilder.DropColumn(
name: "VolumeId",
table: "Chapter");
// Recreate NovelId index and FK
migrationBuilder.CreateIndex(
name: "IX_Chapter_NovelId",
table: "Chapter",
column: "NovelId");
migrationBuilder.AddForeignKey(
name: "FK_Chapter_Novels_NovelId",
table: "Chapter",
column: "NovelId",
principalTable: "Novels",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
// Note: the Volume LocalizationKeys are not cleaned up in the Down migration,
// since they may have been modified after creation. Manual cleanup may be needed.
migrationBuilder.DropTable(
name: "Volume");
}
}
}
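Taken together, the Up method above is a standard expand-contract column move. A schematic Python check of the ordering (step names paraphrased from the migration's comments, not a real API):

```python
# Phases of the AddVolumes Up migration, in the order they run.
PHASES = [
    ("expand",   "create Volume table"),
    ("expand",   "add nullable Chapter.VolumeId"),
    ("backfill", "create one volume per novel and link its chapters"),
    ("contract", "drop Chapter.NovelId FK, index, and column"),
    ("contract", "make VolumeId NOT NULL, add unique index and FK"),
]

def phase_order_ok(phases):
    # The backfill must run strictly after every expand step and
    # strictly before every contract step.
    kinds = [kind for kind, _ in phases]
    return (max(i for i, k in enumerate(kinds) if k == "expand")
            < kinds.index("backfill")
            < min(i for i, k in enumerate(kinds) if k == "contract"))
```

Keeping the new column nullable until the backfill finishes is what lets the migration run against a database that already holds chapters.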

View File

@@ -153,9 +153,6 @@ namespace FictionArchive.Service.NovelService.Migrations
b.Property<Guid>("NameId")
.HasColumnType("uuid");
b.Property<long>("NovelId")
.HasColumnType("bigint");
b.Property<long>("Order")
.HasColumnType("bigint");
@@ -165,13 +162,17 @@ namespace FictionArchive.Service.NovelService.Migrations
b.Property<string>("Url")
.HasColumnType("text");
b.Property<long>("VolumeId")
.HasColumnType("bigint");
b.HasKey("Id");
b.HasIndex("BodyId");
b.HasIndex("NameId");
b.HasIndex("NovelId");
b.HasIndex("VolumeId", "Order")
.IsUnique();
b.ToTable("Chapter");
});
@@ -357,6 +358,39 @@ namespace FictionArchive.Service.NovelService.Migrations
b.ToTable("TranslationEngines");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Volume", b =>
{
b.Property<long>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("bigint");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<long>("Id"));
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Guid>("NameId")
.HasColumnType("uuid");
b.Property<long>("NovelId")
.HasColumnType("bigint");
b.Property<int>("Order")
.HasColumnType("integer");
b.HasKey("Id");
b.HasIndex("NameId");
b.HasIndex("NovelId", "Order")
.IsUnique();
b.ToTable("Volume");
});
modelBuilder.Entity("NovelNovelTag", b =>
{
b.Property<long>("NovelsId")
@@ -427,9 +461,9 @@ namespace FictionArchive.Service.NovelService.Migrations
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("FictionArchive.Service.NovelService.Models.Novels.Novel", "Novel")
b.HasOne("FictionArchive.Service.NovelService.Models.Novels.Volume", "Volume")
.WithMany("Chapters")
.HasForeignKey("NovelId")
.HasForeignKey("VolumeId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
@@ -437,7 +471,7 @@ namespace FictionArchive.Service.NovelService.Migrations
b.Navigation("Name");
b.Navigation("Novel");
b.Navigation("Volume");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Novel", b =>
@@ -509,6 +543,25 @@ namespace FictionArchive.Service.NovelService.Migrations
b.Navigation("Name");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Volume", b =>
{
b.HasOne("FictionArchive.Service.NovelService.Models.Localization.LocalizationKey", "Name")
.WithMany()
.HasForeignKey("NameId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("FictionArchive.Service.NovelService.Models.Novels.Novel", "Novel")
.WithMany("Volumes")
.HasForeignKey("NovelId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.Navigation("Name");
b.Navigation("Novel");
});
modelBuilder.Entity("NovelNovelTag", b =>
{
b.HasOne("FictionArchive.Service.NovelService.Models.Novels.Novel", null)
@@ -535,6 +588,11 @@ namespace FictionArchive.Service.NovelService.Migrations
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Novel", b =>
{
b.Navigation("Volumes");
});
modelBuilder.Entity("FictionArchive.Service.NovelService.Models.Novels.Volume", b =>
{
b.Navigation("Chapters");
});

View File

@@ -12,7 +12,16 @@ public class ChapterReaderDto : BaseDto<uint>
// Navigation context
public uint NovelId { get; init; }
public required string NovelName { get; init; }
public int TotalChapters { get; init; }
// Volume context
public uint VolumeId { get; init; }
public required string VolumeName { get; init; }
public int VolumeOrder { get; init; }
public int TotalChaptersInVolume { get; init; }
// Cross-volume navigation (VolumeOrder + ChapterOrder identify a chapter)
public int? PrevChapterVolumeOrder { get; init; }
public uint? PrevChapterOrder { get; init; }
public int? NextChapterVolumeOrder { get; init; }
public uint? NextChapterOrder { get; init; }
}
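The nullable Prev/Next fields above come from walking the table of contents in global (VolumeOrder, ChapterOrder) order. A sketch of that lookup, with plain tuples standing in for the DTO fields:

```python
def neighbors(chapters, volume_order, chapter_order):
    """chapters: list of (volume_order, chapter_order) tuples.
    Returns (prev, next), each a tuple or None, by walking the
    global (VolumeOrder, ChapterOrder) sort order across volumes."""
    ordered = sorted(chapters)
    i = ordered.index((volume_order, chapter_order))
    prev = ordered[i - 1] if i > 0 else None
    nxt = ordered[i + 1] if i < len(ordered) - 1 else None
    return prev, nxt

# Volume 0 ("Author's Posts") with one entry, volume 1 with three chapters.
toc = [(0, 1), (1, 1), (1, 2), (1, 3)]
```

The first chapter of a volume points back into the previous volume, and the last chapter forward into the next one; None at either end maps to the nullable DTO fields.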

View File

@@ -14,7 +14,7 @@ public class NovelDto : BaseDto<uint>
public required string ExternalId { get; init; }
public required string Name { get; init; }
public required string Description { get; init; }
public required List<ChapterDto> Chapters { get; init; }
public required List<VolumeDto> Volumes { get; init; }
public required List<NovelTagDto> Tags { get; init; }
public ImageDto? CoverImage { get; init; }
}

View File

@@ -0,0 +1,8 @@
namespace FictionArchive.Service.NovelService.Models.DTOs;
public class VolumeDto : BaseDto<uint>
{
public int Order { get; init; }
public required string Name { get; init; }
public required List<ChapterDto> Chapters { get; init; }
}

View File

@@ -0,0 +1,13 @@
using FictionArchive.Service.Shared.Services.EventBus;
namespace FictionArchive.Service.NovelService.Models.IntegrationEvents;
public class ChapterCreatedEvent : IIntegrationEvent
{
public required uint ChapterId { get; init; }
public required uint NovelId { get; init; }
public required uint VolumeId { get; init; }
public required int VolumeOrder { get; init; }
public required uint ChapterOrder { get; init; }
public required string ChapterTitle { get; init; }
}

View File

@@ -5,5 +5,6 @@ namespace FictionArchive.Service.NovelService.Models.IntegrationEvents;
public class ChapterPullRequestedEvent : IIntegrationEvent
{
public uint NovelId { get; set; }
public uint ChapterNumber { get; set; }
public uint VolumeId { get; set; }
public uint ChapterOrder { get; set; }
}

View File

@@ -0,0 +1,13 @@
using FictionArchive.Common.Enums;
using FictionArchive.Service.Shared.Services.EventBus;
namespace FictionArchive.Service.NovelService.Models.IntegrationEvents;
public class NovelCreatedEvent : IIntegrationEvent
{
public required uint NovelId { get; init; }
public required string Title { get; init; }
public required Language OriginalLanguage { get; init; }
public required string Source { get; init; }
public required string AuthorName { get; init; }
}

View File

@@ -20,7 +20,7 @@ public class Chapter : BaseEntity<uint>
#region Navigation Properties
public Novel Novel { get; set; }
public Volume Volume { get; set; }
#endregion
}

View File

@@ -21,7 +21,7 @@ public class Novel : BaseEntity<uint>
public LocalizationKey Name { get; set; }
public LocalizationKey Description { get; set; }
public List<Chapter> Chapters { get; set; }
public List<Volume> Volumes { get; set; }
public List<NovelTag> Tags { get; set; }
public Image? CoverImage { get; set; }
}

View File

@@ -0,0 +1,24 @@
using System.ComponentModel.DataAnnotations.Schema;
using FictionArchive.Service.NovelService.Models.Localization;
using FictionArchive.Service.Shared.Models;
namespace FictionArchive.Service.NovelService.Models.Novels;
[Table("Volume")]
public class Volume : BaseEntity<uint>
{
/// <summary>
/// Signed int to allow special ordering like -1 for "Author Notes" at top.
/// </summary>
public int Order { get; set; }
public LocalizationKey Name { get; set; }
public List<Chapter> Chapters { get; set; }
#region Navigation Properties
public Novel Novel { get; set; }
#endregion
}

View File

@@ -16,7 +16,7 @@ public class NovelMetadata
public Language RawLanguage { get; set; }
public NovelStatus RawStatus { get; set; }
public List<ChapterMetadata> Chapters { get; set; }
public List<VolumeMetadata> Volumes { get; set; }
public List<string> SourceTags { get; set; }
public List<string> SystemTags { get; set; }
public SourceDescriptor SourceDescriptor { get; set; }

View File

@@ -0,0 +1,8 @@
namespace FictionArchive.Service.NovelService.Models.SourceAdapters;
public class VolumeMetadata
{
public int Order { get; set; }
public string Name { get; set; }
public List<ChapterMetadata> Chapters { get; set; }
}

View File

@@ -44,7 +44,6 @@ public class Program
#region GraphQL
builder.Services.AddDefaultGraphQl<Query, Mutation>()
.ModifyCostOptions(opt => opt.MaxFieldCost = 5000)
.AddAuthorization();
#endregion
@@ -63,12 +62,14 @@ public class Program
builder.Services.AddHttpClient<NovelpiaAuthMessageHandler>(client =>
{
client.BaseAddress = new Uri("https://novelpia.com");
});
})
.AddStandardResilienceHandler();
builder.Services.AddHttpClient<ISourceAdapter, NovelpiaAdapter>(client =>
{
client.BaseAddress = new Uri("https://novelpia.com");
})
.AddHttpMessageHandler<NovelpiaAuthMessageHandler>();
.AddHttpMessageHandler<NovelpiaAuthMessageHandler>()
.AddStandardResilienceHandler();
builder.Services.Configure<NovelUpdateServiceConfiguration>(builder.Configuration.GetSection("UpdateService"));
builder.Services.AddTransient<NovelUpdateService>();

View File

@@ -14,6 +14,6 @@ public class ChapterPullRequestedEventHandler : IIntegrationEventHandler<Chapter
public async Task Handle(ChapterPullRequestedEvent @event)
{
await _novelUpdateService.PullChapterContents(@event.NovelId, @event.ChapterNumber);
await _novelUpdateService.PullChapterContents(@event.NovelId, @event.VolumeId, @event.ChapterOrder);
}
}

View File

@@ -10,6 +10,7 @@ public class NovelServiceDbContext(DbContextOptions options, ILogger<NovelServic
: FictionArchiveDbContext(options, logger)
{
public DbSet<Novel> Novels { get; set; }
public DbSet<Volume> Volumes { get; set; }
public DbSet<Chapter> Chapters { get; set; }
public DbSet<Source> Sources { get; set; }
public DbSet<TranslationEngine> TranslationEngines { get; set; }
@@ -25,5 +26,15 @@ public class NovelServiceDbContext(DbContextOptions options, ILogger<NovelServic
modelBuilder.Entity<Novel>()
.HasIndex("ExternalId", "SourceId")
.IsUnique();
// Volume.Order is unique per Novel
modelBuilder.Entity<Volume>()
.HasIndex("NovelId", "Order")
.IsUnique();
// Chapter.Order is unique per Volume
modelBuilder.Entity<Chapter>()
.HasIndex("VolumeId", "Order")
.IsUnique();
}
}

View File

@@ -190,6 +190,48 @@ public class NovelUpdateService
return existingChapters.Concat(newChapters).ToList();
}
private static List<Volume> SynchronizeVolumes(
List<VolumeMetadata> metadataVolumes,
Language rawLanguage,
List<Volume>? existingVolumes)
{
existingVolumes ??= new List<Volume>();
var result = new List<Volume>();
foreach (var metaVolume in metadataVolumes)
{
// Match volumes by Order (unique per novel)
var existingVolume = existingVolumes.FirstOrDefault(v => v.Order == metaVolume.Order);
if (existingVolume != null)
{
// Volume exists - sync its chapters
existingVolume.Chapters = SynchronizeChapters(
metaVolume.Chapters,
rawLanguage,
existingVolume.Chapters);
result.Add(existingVolume);
}
else
{
// New volume - create it with synced chapters
var newVolume = new Volume
{
Order = metaVolume.Order,
Name = LocalizationKey.CreateFromText(metaVolume.Name, rawLanguage),
Chapters = SynchronizeChapters(metaVolume.Chapters, rawLanguage, null)
};
result.Add(newVolume);
}
}
// Keep existing volumes not in metadata (user-created volumes)
var metaOrders = metadataVolumes.Select(v => v.Order).ToHashSet();
result.AddRange(existingVolumes.Where(v => !metaOrders.Contains(v.Order)));
return result;
}
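SynchronizeVolumes matches on Order alone, so an existing volume keeps its identity across syncs, and volumes the source no longer reports survive. A condensed Python sketch of the merge, with dicts standing in for the entities:

```python
def synchronize_volumes(meta_volumes, existing_volumes):
    # Match incoming volumes to existing ones by Order (unique per
    # novel); create the missing ones; keep existing volumes absent
    # from the metadata (e.g. user-created volumes).
    existing_by_order = {v["order"]: v for v in existing_volumes}
    result = []
    for meta in meta_volumes:
        vol = existing_by_order.get(meta["order"])
        if vol is None:
            vol = {"order": meta["order"], "name": meta["name"]}
        result.append(vol)
    meta_orders = {m["order"] for m in meta_volumes}
    result.extend(v for v in existing_volumes if v["order"] not in meta_orders)
    return result

existing = [{"order": 1, "name": "Main Story"}, {"order": 9, "name": "Extras"}]
meta = [{"order": 0, "name": "Author's Posts"}, {"order": 1, "name": "Main Story"}]
out = synchronize_volumes(meta, existing)
```

Matched volumes come back as the same object (so tracked entities stay tracked), and the user-created "Extras" volume is retained.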
private static (Image? image, bool shouldPublishEvent) HandleCoverImage(
ImageData? newCoverData,
Image? existingCoverImage)
@@ -232,7 +274,7 @@ public class NovelUpdateService
metadata.SystemTags,
metadata.RawLanguage);
var chapters = SynchronizeChapters(metadata.Chapters, metadata.RawLanguage, null);
var volumes = SynchronizeVolumes(metadata.Volumes, metadata.RawLanguage, null);
var novel = new Novel
{
@@ -243,7 +285,7 @@ public class NovelUpdateService
CoverImage = metadata.CoverImage != null
? new Image { OriginalPath = metadata.CoverImage.Url }
: null,
Chapters = chapters,
Volumes = volumes,
Description = LocalizationKey.CreateFromText(metadata.Description, metadata.RawLanguage),
Name = LocalizationKey.CreateFromText(metadata.Name, metadata.RawLanguage),
RawStatus = metadata.RawStatus,
@@ -289,7 +331,10 @@ public class NovelUpdateService
.Include(n => n.Description)
.ThenInclude(lk => lk.Texts)
.Include(n => n.Tags)
.Include(n => n.Chapters)
.Include(n => n.Volumes)
.ThenInclude(volume => volume.Chapters)
.ThenInclude(chapter => chapter.Body)
.ThenInclude(localizationKey => localizationKey.Texts)
.Include(n => n.CoverImage)
.FirstOrDefaultAsync(n =>
n.ExternalId == metadata.ExternalId &&
@@ -298,6 +343,12 @@ public class NovelUpdateService
Novel novel;
bool shouldPublishCoverEvent;
// Capture existing chapter IDs to detect new chapters later
var existingChapterIds = existingNovel?.Volumes
.SelectMany(v => v.Chapters)
.Select(c => c.Id)
.ToHashSet() ?? new HashSet<uint>();
if (existingNovel == null)
{
// CREATE PATH: New novel
@@ -325,11 +376,11 @@ public class NovelUpdateService
metadata.SystemTags,
metadata.RawLanguage);
// Synchronize chapters (add only)
novel.Chapters = SynchronizeChapters(
metadata.Chapters,
// Synchronize volumes (and their chapters)
novel.Volumes = SynchronizeVolumes(
metadata.Volumes,
metadata.RawLanguage,
existingNovel.Chapters);
existingNovel.Volumes);
// Handle cover image
(novel.CoverImage, shouldPublishCoverEvent) = HandleCoverImage(
@@ -339,6 +390,36 @@ public class NovelUpdateService
await _dbContext.SaveChangesAsync();
// Publish novel created event for new novels
if (existingNovel == null)
{
await _eventBus.Publish(new NovelCreatedEvent
{
NovelId = novel.Id,
Title = novel.Name.Texts.First(t => t.Language == novel.RawLanguage).Text,
OriginalLanguage = novel.RawLanguage,
Source = novel.Source.Key,
AuthorName = novel.Author.Name.Texts.First(t => t.Language == novel.RawLanguage).Text
});
}
// Publish chapter created events for new chapters
foreach (var volume in novel.Volumes)
{
foreach (var chapter in volume.Chapters.Where(c => !existingChapterIds.Contains(c.Id)))
{
await _eventBus.Publish(new ChapterCreatedEvent
{
ChapterId = chapter.Id,
NovelId = novel.Id,
VolumeId = volume.Id,
VolumeOrder = volume.Order,
ChapterOrder = chapter.Order,
ChapterTitle = chapter.Name.Texts.First(t => t.Language == novel.RawLanguage).Text
});
}
}
// Publish cover image event if needed
if (shouldPublishCoverEvent && novel.CoverImage != null && metadata.CoverImage != null)
{
@@ -351,7 +432,9 @@ public class NovelUpdateService
}
// Publish chapter pull events for chapters without body content
var chaptersNeedingPull = novel.Chapters
foreach (var volume in novel.Volumes)
{
var chaptersNeedingPull = volume.Chapters
.Where(c => c.Body?.Texts == null || !c.Body.Texts.Any())
.ToList();
@@ -360,30 +443,48 @@ public class NovelUpdateService
await _eventBus.Publish(new ChapterPullRequestedEvent
{
NovelId = novel.Id,
ChapterNumber = chapter.Order
VolumeId = volume.Id,
ChapterOrder = chapter.Order
});
}
}
return novel;
}
public async Task<Chapter> PullChapterContents(uint novelId, uint chapterNumber)
public async Task<Chapter> PullChapterContents(uint novelId, uint volumeId, uint chapterOrder)
{
var novel = await _dbContext.Novels.Where(novel => novel.Id == novelId)
.Include(novel => novel.Chapters)
.Include(novel => novel.Volumes)
.ThenInclude(volume => volume.Chapters)
.ThenInclude(chapter => chapter.Body)
.ThenInclude(body => body.Texts)
.Include(novel => novel.Source).Include(novel => novel.Chapters).ThenInclude(chapter => chapter.Images)
.Include(novel => novel.Source)
.Include(novel => novel.Volumes)
.ThenInclude(volume => volume.Chapters)
.ThenInclude(chapter => chapter.Images)
.FirstOrDefaultAsync();
var chapter = novel.Chapters.Where(chapter => chapter.Order == chapterNumber).FirstOrDefault();
var volume = novel.Volumes.FirstOrDefault(v => v.Id == volumeId);
var chapter = volume.Chapters.FirstOrDefault(c => c.Order == chapterOrder);
var adapter = _sourceAdapters.FirstOrDefault(adapter => adapter.SourceDescriptor.Key == novel.Source.Key);
var rawChapter = await adapter.GetRawChapter(chapter.Url);
var localizationText = new LocalizationText()
// If raw text already exists for this language, overwrite it for now. Revision tracking will come later.
var localizationText = chapter.Body.Texts.FirstOrDefault(text => text.Language == novel.RawLanguage);
if (localizationText == null)
{
localizationText = new LocalizationText()
{
Text = rawChapter.Text,
Language = novel.RawLanguage
};
chapter.Body.Texts.Add(localizationText);
}
else
{
localizationText.Text = rawChapter.Text;
}
chapter.Images = rawChapter.ImageData.Select(img => new Image()
{
OriginalPath = img.Url
@@ -466,14 +567,72 @@ public class NovelUpdateService
return importNovelRequestEvent;
}
public async Task<ChapterPullRequestedEvent> QueueChapterPull(uint novelId, uint chapterNumber)
public async Task<ChapterPullRequestedEvent> QueueChapterPull(uint novelId, uint volumeId, uint chapterOrder)
{
var chapterPullEvent = new ChapterPullRequestedEvent()
{
NovelId = novelId,
ChapterNumber = chapterNumber
VolumeId = volumeId,
ChapterOrder = chapterOrder
};
await _eventBus.Publish(chapterPullEvent);
return chapterPullEvent;
}
public async Task DeleteNovel(uint novelId)
{
var novel = await _dbContext.Novels
.Include(n => n.CoverImage)
.Include(n => n.Name).ThenInclude(k => k.Texts)
.Include(n => n.Description).ThenInclude(k => k.Texts)
.Include(n => n.Volumes).ThenInclude(v => v.Name).ThenInclude(k => k.Texts)
.Include(n => n.Volumes).ThenInclude(v => v.Chapters).ThenInclude(c => c.Images)
.Include(n => n.Volumes).ThenInclude(v => v.Chapters).ThenInclude(c => c.Name).ThenInclude(k => k.Texts)
.Include(n => n.Volumes).ThenInclude(v => v.Chapters).ThenInclude(c => c.Body).ThenInclude(k => k.Texts)
.FirstOrDefaultAsync(n => n.Id == novelId);
if (novel == null)
throw new KeyNotFoundException($"Novel with ID '{novelId}' not found");
// Collect all LocalizationKey IDs for cleanup
var locKeyIds = new List<Guid> { novel.Name.Id, novel.Description.Id };
foreach (var volume in novel.Volumes)
{
locKeyIds.Add(volume.Name.Id);
locKeyIds.AddRange(volume.Chapters.Select(c => c.Name.Id));
locKeyIds.AddRange(volume.Chapters.Select(c => c.Body.Id));
}
// 1. Remove LocalizationRequests referencing these keys
var locRequests = await _dbContext.LocalizationRequests
.Where(r => locKeyIds.Contains(r.KeyRequestedForTranslation.Id))
.ToListAsync();
_dbContext.LocalizationRequests.RemoveRange(locRequests);
// 2. Remove LocalizationTexts (NO ACTION FK - won't cascade)
_dbContext.RemoveRange(novel.Name.Texts);
_dbContext.RemoveRange(novel.Description.Texts);
foreach (var volume in novel.Volumes)
{
_dbContext.RemoveRange(volume.Name.Texts);
foreach (var chapter in volume.Chapters)
{
_dbContext.RemoveRange(chapter.Name.Texts);
_dbContext.RemoveRange(chapter.Body.Texts);
}
}
// 3. Remove Images (NO ACTION FK - won't cascade)
if (novel.CoverImage != null)
_dbContext.Images.Remove(novel.CoverImage);
foreach (var volume in novel.Volumes)
{
foreach (var chapter in volume.Chapters)
_dbContext.Images.RemoveRange(chapter.Images);
}
// 4. Remove novel - cascades: volumes, chapters, localization keys, tag mappings
_dbContext.Novels.Remove(novel);
await _dbContext.SaveChangesAsync();
}
}

View File

@@ -66,7 +66,7 @@ public class NovelpiaAdapter : ISourceAdapter
ExternalId = novelId.ToString(),
SystemTags = new List<string>(),
SourceTags = new List<string>(),
Chapters = new List<ChapterMetadata>(),
Volumes = new List<VolumeMetadata>(),
SourceDescriptor = SourceDescriptor
};
@@ -133,6 +133,9 @@ public class NovelpiaAdapter : ISourceAdapter
novel.SourceTags.Add(tag);
}
// Author's posts (from notice_table in the page HTML)
var authorsPosts = ParseAuthorsPosts(novelData);
// Chapters
uint page = 0;
List<ChapterMetadata> chapters = new List<ChapterMetadata>();
@@ -168,7 +171,25 @@ public class NovelpiaAdapter : ISourceAdapter
}
page++;
}
novel.Chapters = chapters;
// Add Author's Posts volume if there are any
if (authorsPosts.Count > 0)
{
novel.Volumes.Add(new VolumeMetadata
{
Order = 0,
Name = "Author's Posts",
Chapters = authorsPosts
});
}
// Main Story volume
novel.Volumes.Add(new VolumeMetadata
{
Order = 1,
Name = "Main Story",
Chapters = chapters
});
return novel;
}
@@ -241,4 +262,40 @@ public class NovelpiaAdapter : ISourceAdapter
}
return await image.Content.ReadAsByteArrayAsync();
}
private List<ChapterMetadata> ParseAuthorsPosts(string novelHtml)
{
var posts = new List<ChapterMetadata>();
// Find the notice_table section
var noticeTableMatch = Regex.Match(novelHtml,
@"(?s)<table[^>]*class=""notice_table[^""]*""[^>]*>(.*?)</table>");
if (!noticeTableMatch.Success)
return posts;
var tableContent = noticeTableMatch.Groups[1].Value;
// Find all td elements with onclick containing viewer URL and extract title from <b>
// HTML structure: <td ... onclick="...location='/viewer/3330612';"><b>Title</b>
var postMatches = Regex.Matches(tableContent,
@"onclick=""[^""]*location='/viewer/(\d+)'[^""]*""[^>]*><b>([^<]+)</b>");
uint order = 1;
foreach (Match match in postMatches)
{
string viewerId = match.Groups[1].Value;
string title = WebUtility.HtmlDecode(match.Groups[2].Value.Trim());
posts.Add(new ChapterMetadata
{
Revision = 0,
Order = order,
Url = $"https://novelpia.com/viewer/{viewerId}",
Name = title
});
order++;
}
return posts;
}
}
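The two regexes in ParseAuthorsPosts can be exercised in isolation. The snippet below runs them against a hand-written notice_table fragment — the markup, viewer ID, and title are constructed for illustration, not taken from Novelpia:

```csharp
using System;
using System.Net;
using System.Text.RegularExpressions;

// Constructed sample of the notice_table markup the adapter parses; the
// viewer ID and title here are made up, not real Novelpia data.
string html =
    @"<table class=""notice_table s_inv""><tr>" +
    @"<td onclick=""javascript:location='/viewer/3330612';""><b>Release notes &amp; schedule</b></td>" +
    @"</tr></table>";

// Step 1: isolate the notice_table body ((?s) lets . span newlines).
var table = Regex.Match(html,
    @"(?s)<table[^>]*class=""notice_table[^""]*""[^>]*>(.*?)</table>");

// Step 2: extract viewer IDs and titles from the onclick handlers.
foreach (Match m in Regex.Matches(table.Groups[1].Value,
    @"onclick=""[^""]*location='/viewer/(\d+)'[^""]*""[^>]*><b>([^<]+)</b>"))
{
    string id = m.Groups[1].Value;
    string title = WebUtility.HtmlDecode(m.Groups[2].Value.Trim());
    Console.WriteLine($"{id}: {title}"); // 3330612: Release notes & schedule
}
```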

View File

@@ -6,6 +6,7 @@ using Microsoft.IdentityModel.Tokens;
using FictionArchive.Service.Shared.Constants;
using FictionArchive.Service.Shared.Models.Authentication;
using System.Linq;
using System.Security.Claims;
namespace FictionArchive.Service.Shared.Extensions;
@@ -78,7 +79,7 @@ public static class AuthenticationExtensions
logger.LogDebug(
"JWT token validated for subject: {Subject}",
context.Principal?.FindFirst("sub")?.Value ?? "unknown");
context.Principal?.FindFirst(ClaimTypes.NameIdentifier)?.Value ?? "unknown");
return existingEvents?.OnTokenValidated?.Invoke(context) ?? Task.CompletedTask;
}

View File

@@ -22,6 +22,7 @@ public static class GraphQLExtensions
.AddErrorFilter<LoggingErrorFilter>()
.AddType<UnsignedIntType>()
.AddType<InstantType>()
.ModifyCostOptions(opt => opt.MaxFieldCost = 10000)
.AddMutationConventions(applyToAllMutations: true)
.AddFiltering(opt => opt.AddDefaults().BindRuntimeType<uint, UnsignedIntOperationFilterInputType>())
.AddSorting()

View File

@@ -25,10 +25,12 @@
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageReference Include="Microsoft.Extensions.Http.Resilience" Version="10.1.0" />
<PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
<PackageReference Include="NodaTime.Serialization.JsonNet" Version="3.2.0" />
<PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL" Version="9.0.4" />
<PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL.NodaTime" Version="9.0.4" />
<PackageReference Include="Polly" Version="8.6.5" />
<PackageReference Include="RabbitMQ.Client" Version="7.2.0" />
<PackageReference Include="Microsoft.AspNetCore.Authentication.JwtBearer" Version="8.0.11" />
</ItemGroup>

View File

@@ -0,0 +1,23 @@
FROM mcr.microsoft.com/dotnet/aspnet:8.0 AS base
USER $APP_UID
WORKDIR /app
EXPOSE 8080
EXPOSE 8081
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
ARG BUILD_CONFIGURATION=Release
WORKDIR /src
COPY ["FictionArchive.Service.UserNovelDataService/FictionArchive.Service.UserNovelDataService.csproj", "FictionArchive.Service.UserNovelDataService/"]
RUN dotnet restore "FictionArchive.Service.UserNovelDataService/FictionArchive.Service.UserNovelDataService.csproj"
COPY . .
WORKDIR "/src/FictionArchive.Service.UserNovelDataService"
RUN dotnet build "./FictionArchive.Service.UserNovelDataService.csproj" -c $BUILD_CONFIGURATION -o /app/build
FROM build AS publish
ARG BUILD_CONFIGURATION=Release
RUN dotnet publish "./FictionArchive.Service.UserNovelDataService.csproj" -c $BUILD_CONFIGURATION -o /app/publish /p:UseAppHost=false
FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "FictionArchive.Service.UserNovelDataService.dll"]

View File

@@ -0,0 +1,27 @@
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>net8.0</TargetFramework>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<DockerDefaultTargetOS>Linux</DockerDefaultTargetOS>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="9.0.11">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
</ItemGroup>
<ItemGroup>
<Content Include="..\.dockerignore">
<Link>.dockerignore</Link>
</Content>
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\FictionArchive.Service.Shared\FictionArchive.Service.Shared.csproj" />
</ItemGroup>
</Project>

View File

@@ -0,0 +1,109 @@
using System.Security.Claims;
using FictionArchive.Service.UserNovelDataService.Models.Database;
using FictionArchive.Service.UserNovelDataService.Models.DTOs;
using FictionArchive.Service.UserNovelDataService.Services;
using HotChocolate.Authorization;
using HotChocolate.Types;
using Microsoft.EntityFrameworkCore;
namespace FictionArchive.Service.UserNovelDataService.GraphQL;
public class Mutation
{
[Authorize]
[Error<InvalidOperationException>]
public async Task<BookmarkPayload> UpsertBookmark(
UserNovelDataServiceDbContext dbContext,
ClaimsPrincipal claimsPrincipal,
UpsertBookmarkInput input)
{
var oAuthProviderId = claimsPrincipal.FindFirst(ClaimTypes.NameIdentifier)?.Value;
if (string.IsNullOrEmpty(oAuthProviderId))
{
throw new InvalidOperationException("Unable to determine current user identity");
}
var user = await dbContext.Users
.FirstOrDefaultAsync(u => u.OAuthProviderId == oAuthProviderId);
if (user == null)
{
// Auto-create user if not exists
user = new User { OAuthProviderId = oAuthProviderId };
dbContext.Users.Add(user);
await dbContext.SaveChangesAsync();
}
var existingBookmark = await dbContext.Bookmarks
.FirstOrDefaultAsync(b => b.UserId == user.Id && b.ChapterId == input.ChapterId);
if (existingBookmark != null)
{
// Update existing
existingBookmark.Description = input.Description;
}
else
{
// Create new
existingBookmark = new Bookmark
{
UserId = user.Id,
NovelId = input.NovelId,
ChapterId = input.ChapterId,
Description = input.Description
};
dbContext.Bookmarks.Add(existingBookmark);
}
await dbContext.SaveChangesAsync();
return new BookmarkPayload
{
Success = true,
Bookmark = new BookmarkDto
{
Id = existingBookmark.Id,
ChapterId = existingBookmark.ChapterId,
NovelId = existingBookmark.NovelId,
Description = existingBookmark.Description,
CreatedTime = existingBookmark.CreatedTime
}
};
}
[Authorize]
[Error<InvalidOperationException>]
public async Task<BookmarkPayload> RemoveBookmark(
UserNovelDataServiceDbContext dbContext,
ClaimsPrincipal claimsPrincipal,
uint chapterId)
{
var oAuthProviderId = claimsPrincipal.FindFirst(ClaimTypes.NameIdentifier)?.Value;
if (string.IsNullOrEmpty(oAuthProviderId))
{
throw new InvalidOperationException("Unable to determine current user identity");
}
var user = await dbContext.Users
.AsNoTracking()
.FirstOrDefaultAsync(u => u.OAuthProviderId == oAuthProviderId);
if (user == null)
{
return new BookmarkPayload { Success = false };
}
var bookmark = await dbContext.Bookmarks
.FirstOrDefaultAsync(b => b.UserId == user.Id && b.ChapterId == chapterId);
if (bookmark == null)
{
return new BookmarkPayload { Success = false };
}
dbContext.Bookmarks.Remove(bookmark);
await dbContext.SaveChangesAsync();
return new BookmarkPayload { Success = true };
}
}
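The core contract of UpsertBookmark above is that it is idempotent per (user, chapter): a repeat call for the same chapter rewrites the description instead of creating a second row, which the unique IX_Bookmarks_UserId_ChapterId index also enforces at the database level. A minimal in-memory sketch of that contract (simplified types, not the service's actual code):

```csharp
using System;
using System.Collections.Generic;

var bookmarks = new Dictionary<(Guid UserId, uint ChapterId), string?>();

// Update-or-insert keyed on (user, chapter), mirroring the mutation's behavior.
void Upsert(Guid userId, uint chapterId, string? description)
    => bookmarks[(userId, chapterId)] = description;

var user = Guid.NewGuid();
Upsert(user, 12, "cliffhanger");
Upsert(user, 12, "resolved in next chapter"); // same chapter: overwrites, no new row

Console.WriteLine(bookmarks.Count);        // 1
Console.WriteLine(bookmarks[(user, 12u)]); // resolved in next chapter
```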

View File

@@ -0,0 +1,45 @@
using System.Security.Claims;
using FictionArchive.Service.UserNovelDataService.Models.DTOs;
using FictionArchive.Service.UserNovelDataService.Services;
using HotChocolate.Authorization;
using Microsoft.EntityFrameworkCore;
namespace FictionArchive.Service.UserNovelDataService.GraphQL;
public class Query
{
[Authorize]
public async Task<IQueryable<BookmarkDto>> GetBookmarks(
UserNovelDataServiceDbContext dbContext,
ClaimsPrincipal claimsPrincipal,
uint novelId)
{
var oAuthProviderId = claimsPrincipal.FindFirst(ClaimTypes.NameIdentifier)?.Value;
if (string.IsNullOrEmpty(oAuthProviderId))
{
return new List<BookmarkDto>().AsQueryable();
}
var user = await dbContext.Users
.AsNoTracking()
.FirstOrDefaultAsync(u => u.OAuthProviderId == oAuthProviderId);
if (user == null)
{
return new List<BookmarkDto>().AsQueryable();
}
return dbContext.Bookmarks
.AsNoTracking()
.Where(b => b.UserId == user.Id && b.NovelId == novelId)
.OrderByDescending(b => b.CreatedTime)
.Select(b => new BookmarkDto
{
Id = b.Id,
ChapterId = b.ChapterId,
NovelId = b.NovelId,
Description = b.Description,
CreatedTime = b.CreatedTime
});
}
}

View File

@@ -0,0 +1,99 @@
// <auto-generated />
using System;
using FictionArchive.Service.UserNovelDataService.Services;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Migrations;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
using NodaTime;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace FictionArchive.Service.UserNovelDataService.Migrations
{
[DbContext(typeof(UserNovelDataServiceDbContext))]
[Migration("20251230181559_AddBookmarks")]
partial class AddBookmarks
{
/// <inheritdoc />
protected override void BuildTargetModel(ModelBuilder modelBuilder)
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("ProductVersion", "9.0.11")
.HasAnnotation("Relational:MaxIdentifierLength", 63);
NpgsqlModelBuilderExtensions.UseIdentityByDefaultColumns(modelBuilder);
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Bookmark", b =>
{
b.Property<int>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("integer");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<int>("Id"));
b.Property<long>("ChapterId")
.HasColumnType("bigint");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<string>("Description")
.HasColumnType("text");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<long>("NovelId")
.HasColumnType("bigint");
b.Property<Guid>("UserId")
.HasColumnType("uuid");
b.HasKey("Id");
b.HasIndex("UserId", "ChapterId")
.IsUnique();
b.HasIndex("UserId", "NovelId");
b.ToTable("Bookmarks");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.User", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<string>("OAuthProviderId")
.IsRequired()
.HasColumnType("text");
b.HasKey("Id");
b.ToTable("Users");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Bookmark", b =>
{
b.HasOne("FictionArchive.Service.UserNovelDataService.Models.Database.User", "User")
.WithMany()
.HasForeignKey("UserId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.Navigation("User");
});
#pragma warning restore 612, 618
}
}
}

View File

@@ -0,0 +1,76 @@
using System;
using Microsoft.EntityFrameworkCore.Migrations;
using NodaTime;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace FictionArchive.Service.UserNovelDataService.Migrations
{
/// <inheritdoc />
public partial class AddBookmarks : Migration
{
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.CreateTable(
name: "Users",
columns: table => new
{
Id = table.Column<Guid>(type: "uuid", nullable: false),
OAuthProviderId = table.Column<string>(type: "text", nullable: false),
CreatedTime = table.Column<Instant>(type: "timestamp with time zone", nullable: false),
LastUpdatedTime = table.Column<Instant>(type: "timestamp with time zone", nullable: false)
},
constraints: table =>
{
table.PrimaryKey("PK_Users", x => x.Id);
});
migrationBuilder.CreateTable(
name: "Bookmarks",
columns: table => new
{
Id = table.Column<int>(type: "integer", nullable: false)
.Annotation("Npgsql:ValueGenerationStrategy", NpgsqlValueGenerationStrategy.IdentityByDefaultColumn),
UserId = table.Column<Guid>(type: "uuid", nullable: false),
ChapterId = table.Column<long>(type: "bigint", nullable: false),
NovelId = table.Column<long>(type: "bigint", nullable: false),
Description = table.Column<string>(type: "text", nullable: true),
CreatedTime = table.Column<Instant>(type: "timestamp with time zone", nullable: false),
LastUpdatedTime = table.Column<Instant>(type: "timestamp with time zone", nullable: false)
},
constraints: table =>
{
table.PrimaryKey("PK_Bookmarks", x => x.Id);
table.ForeignKey(
name: "FK_Bookmarks_Users_UserId",
column: x => x.UserId,
principalTable: "Users",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
});
migrationBuilder.CreateIndex(
name: "IX_Bookmarks_UserId_ChapterId",
table: "Bookmarks",
columns: new[] { "UserId", "ChapterId" },
unique: true);
migrationBuilder.CreateIndex(
name: "IX_Bookmarks_UserId_NovelId",
table: "Bookmarks",
columns: new[] { "UserId", "NovelId" });
}
/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropTable(
name: "Bookmarks");
migrationBuilder.DropTable(
name: "Users");
}
}
}

View File

@@ -0,0 +1,198 @@
// <auto-generated />
using System;
using FictionArchive.Service.UserNovelDataService.Services;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Migrations;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
using NodaTime;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace FictionArchive.Service.UserNovelDataService.Migrations
{
[DbContext(typeof(UserNovelDataServiceDbContext))]
[Migration("20260119184741_AddNovelVolumeChapter")]
partial class AddNovelVolumeChapter
{
/// <inheritdoc />
protected override void BuildTargetModel(ModelBuilder modelBuilder)
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("ProductVersion", "9.0.11")
.HasAnnotation("Relational:MaxIdentifierLength", 63);
NpgsqlModelBuilderExtensions.UseIdentityByDefaultColumns(modelBuilder);
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Bookmark", b =>
{
b.Property<int>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("integer");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<int>("Id"));
b.Property<long>("ChapterId")
.HasColumnType("bigint");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<string>("Description")
.HasColumnType("text");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<long>("NovelId")
.HasColumnType("bigint");
b.Property<Guid>("UserId")
.HasColumnType("uuid");
b.HasKey("Id");
b.HasIndex("UserId", "ChapterId")
.IsUnique();
b.HasIndex("UserId", "NovelId");
b.ToTable("Bookmarks");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Chapter", b =>
{
b.Property<long>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("bigint");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<long>("Id"));
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<long>("VolumeId")
.HasColumnType("bigint");
b.HasKey("Id");
b.HasIndex("VolumeId");
b.ToTable("Chapters");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Novel", b =>
{
b.Property<long>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("bigint");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<long>("Id"));
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.HasKey("Id");
b.ToTable("Novels");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.User", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<string>("OAuthProviderId")
.IsRequired()
.HasColumnType("text");
b.HasKey("Id");
b.ToTable("Users");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Volume", b =>
{
b.Property<long>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("bigint");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<long>("Id"));
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<long>("NovelId")
.HasColumnType("bigint");
b.HasKey("Id");
b.HasIndex("NovelId");
b.ToTable("Volumes");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Bookmark", b =>
{
b.HasOne("FictionArchive.Service.UserNovelDataService.Models.Database.User", "User")
.WithMany()
.HasForeignKey("UserId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.Navigation("User");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Chapter", b =>
{
b.HasOne("FictionArchive.Service.UserNovelDataService.Models.Database.Volume", "Volume")
.WithMany("Chapters")
.HasForeignKey("VolumeId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.Navigation("Volume");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Volume", b =>
{
b.HasOne("FictionArchive.Service.UserNovelDataService.Models.Database.Novel", "Novel")
.WithMany("Volumes")
.HasForeignKey("NovelId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.Navigation("Novel");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Novel", b =>
{
b.Navigation("Volumes");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Volume", b =>
{
b.Navigation("Chapters");
});
#pragma warning restore 612, 618
}
}
}

View File

@@ -0,0 +1,95 @@
using Microsoft.EntityFrameworkCore.Migrations;
using NodaTime;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace FictionArchive.Service.UserNovelDataService.Migrations
{
/// <inheritdoc />
public partial class AddNovelVolumeChapter : Migration
{
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.CreateTable(
name: "Novels",
columns: table => new
{
Id = table.Column<long>(type: "bigint", nullable: false)
.Annotation("Npgsql:ValueGenerationStrategy", NpgsqlValueGenerationStrategy.IdentityByDefaultColumn),
CreatedTime = table.Column<Instant>(type: "timestamp with time zone", nullable: false),
LastUpdatedTime = table.Column<Instant>(type: "timestamp with time zone", nullable: false)
},
constraints: table =>
{
table.PrimaryKey("PK_Novels", x => x.Id);
});
migrationBuilder.CreateTable(
name: "Volumes",
columns: table => new
{
Id = table.Column<long>(type: "bigint", nullable: false)
.Annotation("Npgsql:ValueGenerationStrategy", NpgsqlValueGenerationStrategy.IdentityByDefaultColumn),
NovelId = table.Column<long>(type: "bigint", nullable: false),
CreatedTime = table.Column<Instant>(type: "timestamp with time zone", nullable: false),
LastUpdatedTime = table.Column<Instant>(type: "timestamp with time zone", nullable: false)
},
constraints: table =>
{
table.PrimaryKey("PK_Volumes", x => x.Id);
table.ForeignKey(
name: "FK_Volumes_Novels_NovelId",
column: x => x.NovelId,
principalTable: "Novels",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
});
migrationBuilder.CreateTable(
name: "Chapters",
columns: table => new
{
Id = table.Column<long>(type: "bigint", nullable: false)
.Annotation("Npgsql:ValueGenerationStrategy", NpgsqlValueGenerationStrategy.IdentityByDefaultColumn),
VolumeId = table.Column<long>(type: "bigint", nullable: false),
CreatedTime = table.Column<Instant>(type: "timestamp with time zone", nullable: false),
LastUpdatedTime = table.Column<Instant>(type: "timestamp with time zone", nullable: false)
},
constraints: table =>
{
table.PrimaryKey("PK_Chapters", x => x.Id);
table.ForeignKey(
name: "FK_Chapters_Volumes_VolumeId",
column: x => x.VolumeId,
principalTable: "Volumes",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
});
migrationBuilder.CreateIndex(
name: "IX_Chapters_VolumeId",
table: "Chapters",
column: "VolumeId");
migrationBuilder.CreateIndex(
name: "IX_Volumes_NovelId",
table: "Volumes",
column: "NovelId");
}
/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropTable(
name: "Chapters");
migrationBuilder.DropTable(
name: "Volumes");
migrationBuilder.DropTable(
name: "Novels");
}
}
}

View File

@@ -0,0 +1,195 @@
// <auto-generated />
using System;
using FictionArchive.Service.UserNovelDataService.Services;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
using NodaTime;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace FictionArchive.Service.UserNovelDataService.Migrations
{
[DbContext(typeof(UserNovelDataServiceDbContext))]
partial class UserNovelDataServiceDbContextModelSnapshot : ModelSnapshot
{
protected override void BuildModel(ModelBuilder modelBuilder)
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("ProductVersion", "9.0.11")
.HasAnnotation("Relational:MaxIdentifierLength", 63);
NpgsqlModelBuilderExtensions.UseIdentityByDefaultColumns(modelBuilder);
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Bookmark", b =>
{
b.Property<int>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("integer");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<int>("Id"));
b.Property<long>("ChapterId")
.HasColumnType("bigint");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<string>("Description")
.HasColumnType("text");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<long>("NovelId")
.HasColumnType("bigint");
b.Property<Guid>("UserId")
.HasColumnType("uuid");
b.HasKey("Id");
b.HasIndex("UserId", "ChapterId")
.IsUnique();
b.HasIndex("UserId", "NovelId");
b.ToTable("Bookmarks");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Chapter", b =>
{
b.Property<long>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("bigint");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<long>("Id"));
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<long>("VolumeId")
.HasColumnType("bigint");
b.HasKey("Id");
b.HasIndex("VolumeId");
b.ToTable("Chapters");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Novel", b =>
{
b.Property<long>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("bigint");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<long>("Id"));
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.HasKey("Id");
b.ToTable("Novels");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.User", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<string>("OAuthProviderId")
.IsRequired()
.HasColumnType("text");
b.HasKey("Id");
b.ToTable("Users");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Volume", b =>
{
b.Property<long>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("bigint");
NpgsqlPropertyBuilderExtensions.UseIdentityByDefaultColumn(b.Property<long>("Id"));
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<long>("NovelId")
.HasColumnType("bigint");
b.HasKey("Id");
b.HasIndex("NovelId");
b.ToTable("Volumes");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Bookmark", b =>
{
b.HasOne("FictionArchive.Service.UserNovelDataService.Models.Database.User", "User")
.WithMany()
.HasForeignKey("UserId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.Navigation("User");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Chapter", b =>
{
b.HasOne("FictionArchive.Service.UserNovelDataService.Models.Database.Volume", "Volume")
.WithMany("Chapters")
.HasForeignKey("VolumeId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.Navigation("Volume");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Volume", b =>
{
b.HasOne("FictionArchive.Service.UserNovelDataService.Models.Database.Novel", "Novel")
.WithMany("Volumes")
.HasForeignKey("NovelId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.Navigation("Novel");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Novel", b =>
{
b.Navigation("Volumes");
});
modelBuilder.Entity("FictionArchive.Service.UserNovelDataService.Models.Database.Volume", b =>
{
b.Navigation("Chapters");
});
#pragma warning restore 612, 618
}
}
}

View File

@@ -0,0 +1,12 @@
using NodaTime;
namespace FictionArchive.Service.UserNovelDataService.Models.DTOs;
public class BookmarkDto
{
public int Id { get; init; }
public uint ChapterId { get; init; }
public uint NovelId { get; init; }
public string? Description { get; init; }
public Instant CreatedTime { get; init; }
}

View File

@@ -0,0 +1,7 @@
namespace FictionArchive.Service.UserNovelDataService.Models.DTOs;
public class BookmarkPayload
{
public BookmarkDto? Bookmark { get; init; }
public bool Success { get; init; }
}

View File

@@ -0,0 +1,3 @@
namespace FictionArchive.Service.UserNovelDataService.Models.DTOs;
public record UpsertBookmarkInput(uint NovelId, uint ChapterId, string? Description);

View File

@@ -0,0 +1,14 @@
using FictionArchive.Service.Shared.Models;
namespace FictionArchive.Service.UserNovelDataService.Models.Database;
public class Bookmark : BaseEntity<int>
{
public Guid UserId { get; set; }
public virtual User User { get; set; } = null!;
public uint ChapterId { get; set; }
public uint NovelId { get; set; }
public string? Description { get; set; }
}

View File

@@ -0,0 +1,9 @@
using FictionArchive.Service.Shared.Models;
namespace FictionArchive.Service.UserNovelDataService.Models.Database;
public class Chapter : BaseEntity<uint>
{
public uint VolumeId { get; set; }
public virtual Volume Volume { get; set; } = null!;
}

View File

@@ -0,0 +1,8 @@
using FictionArchive.Service.Shared.Models;
namespace FictionArchive.Service.UserNovelDataService.Models.Database;
public class Novel : BaseEntity<uint>
{
public virtual ICollection<Volume> Volumes { get; set; } = new List<Volume>();
}

View File

@@ -0,0 +1,8 @@
using FictionArchive.Service.Shared.Models;
namespace FictionArchive.Service.UserNovelDataService.Models.Database;
public class User : BaseEntity<Guid>
{
public required string OAuthProviderId { get; set; }
}

View File

@@ -0,0 +1,10 @@
using FictionArchive.Service.Shared.Models;
namespace FictionArchive.Service.UserNovelDataService.Models.Database;
public class Volume : BaseEntity<uint>
{
public uint NovelId { get; set; }
public virtual Novel Novel { get; set; } = null!;
public virtual ICollection<Chapter> Chapters { get; set; } = new List<Chapter>();
}

View File

@@ -0,0 +1,13 @@
using FictionArchive.Service.Shared.Services.EventBus;
namespace FictionArchive.Service.UserNovelDataService.Models.IntegrationEvents;
public class ChapterCreatedEvent : IIntegrationEvent
{
public required uint ChapterId { get; init; }
public required uint NovelId { get; init; }
public required uint VolumeId { get; init; }
public required int VolumeOrder { get; init; }
public required uint ChapterOrder { get; init; }
public required string ChapterTitle { get; init; }
}

View File

@@ -0,0 +1,13 @@
using FictionArchive.Common.Enums;
using FictionArchive.Service.Shared.Services.EventBus;
namespace FictionArchive.Service.UserNovelDataService.Models.IntegrationEvents;
public class NovelCreatedEvent : IIntegrationEvent
{
public required uint NovelId { get; init; }
public required string Title { get; init; }
public required Language OriginalLanguage { get; init; }
public required string Source { get; init; }
public required string AuthorName { get; init; }
}

View File

@@ -0,0 +1,15 @@
using FictionArchive.Service.Shared.Services.EventBus;
namespace FictionArchive.Service.UserNovelDataService.Models.IntegrationEvents;
public class UserInvitedEvent : IIntegrationEvent
{
public Guid InvitedUserId { get; set; }
public required string InvitedUsername { get; set; }
public required string InvitedEmail { get; set; }
public required string InvitedOAuthProviderId { get; set; }
public Guid InviterId { get; set; }
public required string InviterUsername { get; set; }
public required string InviterOAuthProviderId { get; set; }
}

View File

@@ -0,0 +1,80 @@
using FictionArchive.Common.Extensions;
using FictionArchive.Service.Shared;
using FictionArchive.Service.Shared.Extensions;
using FictionArchive.Service.Shared.Services.EventBus.Implementations;
using FictionArchive.Service.UserNovelDataService.GraphQL;
using FictionArchive.Service.UserNovelDataService.Models.IntegrationEvents;
using FictionArchive.Service.UserNovelDataService.Services;
using FictionArchive.Service.UserNovelDataService.Services.EventHandlers;

namespace FictionArchive.Service.UserNovelDataService;

public class Program
{
    public static void Main(string[] args)
    {
        var builder = WebApplication.CreateBuilder(args);
        var isSchemaExport = SchemaExportDetector.IsSchemaExportMode(args);

        builder.AddLocalAppsettings();
        builder.Services.AddMemoryCache();
        builder.Services.AddHealthChecks();

        #region Event Bus

        if (!isSchemaExport)
        {
            builder.Services.AddRabbitMQ(opt =>
                {
                    builder.Configuration.GetSection("RabbitMQ").Bind(opt);
                })
                .Subscribe<NovelCreatedEvent, NovelCreatedEventHandler>()
                .Subscribe<ChapterCreatedEvent, ChapterCreatedEventHandler>()
                .Subscribe<UserInvitedEvent, UserInvitedEventHandler>();
        }

        #endregion

        #region GraphQL

        builder.Services.AddDefaultGraphQl<Query, Mutation>()
            .AddAuthorization();

        #endregion

        #region Database

        builder.Services.RegisterDbContext<UserNovelDataServiceDbContext>(
            builder.Configuration.GetConnectionString("DefaultConnection"),
            skipInfrastructure: isSchemaExport);

        #endregion

        // Authentication & Authorization
        builder.Services.AddOidcAuthentication(builder.Configuration);
        builder.Services.AddFictionArchiveAuthorization();

        var app = builder.Build();

        // Update database (skip in schema export mode)
        if (!isSchemaExport)
        {
            using var scope = app.Services.CreateScope();
            var dbContext = scope.ServiceProvider.GetRequiredService<UserNovelDataServiceDbContext>();
            dbContext.UpdateDatabase();
        }

        app.UseHttpsRedirection();
        app.MapHealthChecks("/healthz");
        app.UseAuthentication();
        app.UseAuthorization();
        app.MapGraphQL();
        app.RunWithGraphQLCommands(args);
    }
}

View File

@@ -0,0 +1,38 @@
{
  "$schema": "http://json.schemastore.org/launchsettings.json",
  "iisSettings": {
    "windowsAuthentication": false,
    "anonymousAuthentication": true,
    "iisExpress": {
      "applicationUrl": "http://localhost:26318",
      "sslPort": 44303
    }
  },
  "profiles": {
    "http": {
      "commandName": "Project",
      "dotnetRunMessages": true,
      "launchBrowser": true,
      "applicationUrl": "http://localhost:5130",
      "environmentVariables": {
        "ASPNETCORE_ENVIRONMENT": "Development"
      }
    },
    "https": {
      "commandName": "Project",
      "dotnetRunMessages": true,
      "launchBrowser": true,
      "applicationUrl": "https://localhost:7298;http://localhost:5130",
      "environmentVariables": {
        "ASPNETCORE_ENVIRONMENT": "Development"
      }
    },
    "IIS Express": {
      "commandName": "IISExpress",
      "launchBrowser": true,
      "environmentVariables": {
        "ASPNETCORE_ENVIRONMENT": "Development"
      }
    }
  }
}

View File

@@ -0,0 +1,93 @@
# UserNovelDataService Backfill Scripts

SQL scripts for backfilling data from UserService and NovelService into UserNovelDataService.

## Prerequisites

1. **Run EF migrations** on the UserNovelDataService database to ensure all tables exist:

   ```bash
   dotnet ef database update --project FictionArchive.Service.UserNovelDataService
   ```

   This applies the `AddNovelVolumeChapter` migration, which creates:

   - `Novels` table (Id, CreatedTime, LastUpdatedTime)
   - `Volumes` table (Id, NovelId FK, CreatedTime, LastUpdatedTime)
   - `Chapters` table (Id, VolumeId FK, CreatedTime, LastUpdatedTime)
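Before running any inserts, it may be worth confirming the migration actually created these tables; a minimal check, assuming the default `public` schema:

```sql
-- Should return three rows: Chapters, Novels, Volumes
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'public'
  AND table_name IN ('Novels', 'Volumes', 'Chapters');
```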
## Execution Order

Run the scripts in numeric order:

### Extraction (run against source databases)

1. `01_extract_users_from_userservice.sql` - run against the **UserService** DB
2. `02_extract_novels_from_novelservice.sql` - run against the **NovelService** DB
3. `03_extract_volumes_from_novelservice.sql` - run against the **NovelService** DB
4. `04_extract_chapters_from_novelservice.sql` - run against the **NovelService** DB

### Insertion (run against the UserNovelDataService database)

5. `05_insert_users_to_usernoveldataservice.sql`
6. `06_insert_novels_to_usernoveldataservice.sql`
7. `07_insert_volumes_to_usernoveldataservice.sql`
8. `08_insert_chapters_to_usernoveldataservice.sql`

## Methods

Each script provides three options:

1. **SELECT for review** - review the data before export
2. **Generate INSERT statements** - creates individual INSERT statements (good for small datasets)
3. **CSV export/import** - use PostgreSQL `\copy` for bulk operations (recommended for large datasets)

## Example Workflow

### Using CSV Export/Import (Recommended)

```bash
# 1. Export from the source databases
psql -h localhost -U postgres -d userservice -c "\copy (SELECT \"Id\", \"OAuthProviderId\", \"CreatedTime\", \"LastUpdatedTime\" FROM \"Users\" WHERE \"Disabled\" = false) TO '/tmp/users_export.csv' WITH CSV HEADER"
psql -h localhost -U postgres -d novelservice -c "\copy (SELECT \"Id\", \"CreatedTime\", \"LastUpdatedTime\" FROM \"Novels\") TO '/tmp/novels_export.csv' WITH CSV HEADER"
psql -h localhost -U postgres -d novelservice -c "\copy (SELECT \"Id\", \"NovelId\", \"CreatedTime\", \"LastUpdatedTime\" FROM \"Volume\" ORDER BY \"NovelId\", \"Id\") TO '/tmp/volumes_export.csv' WITH CSV HEADER"
psql -h localhost -U postgres -d novelservice -c "\copy (SELECT \"Id\", \"VolumeId\", \"CreatedTime\", \"LastUpdatedTime\" FROM \"Chapter\" ORDER BY \"VolumeId\", \"Id\") TO '/tmp/chapters_export.csv' WITH CSV HEADER"

# 2. Import into UserNovelDataService (order matters due to FK constraints!)
psql -h localhost -U postgres -d usernoveldataservice -c "\copy \"Users\" (\"Id\", \"OAuthProviderId\", \"CreatedTime\", \"LastUpdatedTime\") FROM '/tmp/users_export.csv' WITH CSV HEADER"
psql -h localhost -U postgres -d usernoveldataservice -c "\copy \"Novels\" (\"Id\", \"CreatedTime\", \"LastUpdatedTime\") FROM '/tmp/novels_export.csv' WITH CSV HEADER"
psql -h localhost -U postgres -d usernoveldataservice -c "\copy \"Volumes\" (\"Id\", \"NovelId\", \"CreatedTime\", \"LastUpdatedTime\") FROM '/tmp/volumes_export.csv' WITH CSV HEADER"
psql -h localhost -U postgres -d usernoveldataservice -c "\copy \"Chapters\" (\"Id\", \"VolumeId\", \"CreatedTime\", \"LastUpdatedTime\") FROM '/tmp/chapters_export.csv' WITH CSV HEADER"
```

**Important**: insert order matters because of the foreign key constraints:

1. Users (no dependencies)
2. Novels (no dependencies)
3. Volumes (depends on Novels)
4. Chapters (depends on Volumes)
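To avoid leaving the target half-backfilled when an import breaks this order, the four imports can also be wrapped in a single transaction; a sketch, run from `psql` against the UserNovelDataService database and assuming the same file paths as above:

```sql
-- A failed \copy aborts the transaction, so the final COMMIT becomes a
-- rollback instead of committing a partial backfill.
BEGIN;
\copy "Users" ("Id", "OAuthProviderId", "CreatedTime", "LastUpdatedTime") FROM '/tmp/users_export.csv' WITH CSV HEADER
\copy "Novels" ("Id", "CreatedTime", "LastUpdatedTime") FROM '/tmp/novels_export.csv' WITH CSV HEADER
\copy "Volumes" ("Id", "NovelId", "CreatedTime", "LastUpdatedTime") FROM '/tmp/volumes_export.csv' WITH CSV HEADER
\copy "Chapters" ("Id", "VolumeId", "CreatedTime", "LastUpdatedTime") FROM '/tmp/chapters_export.csv' WITH CSV HEADER
COMMIT;
```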
### Using dblink (Cross-database queries)

If both databases are on the same PostgreSQL server, you can use the `dblink` extension for direct cross-database inserts. See the commented examples in each insert script.

## Verification

After running the backfill, verify that the counts match:

```sql
-- Run on UserService DB
SELECT COUNT(*) as user_count FROM "Users" WHERE "Disabled" = false;

-- Run on NovelService DB
SELECT COUNT(*) as novel_count FROM "Novels";
SELECT COUNT(*) as volume_count FROM "Volume";
SELECT COUNT(*) as chapter_count FROM "Chapter";

-- Run on UserNovelDataService DB
SELECT COUNT(*) as user_count FROM "Users";
SELECT COUNT(*) as novel_count FROM "Novels";
SELECT COUNT(*) as volume_count FROM "Volumes";
SELECT COUNT(*) as chapter_count FROM "Chapters";
```
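Beyond raw counts, an orphan check on the target database confirms the parent/child relationships survived the backfill. With the FK constraints enforced both queries should return zero rows; the check is mainly useful if constraints were disabled during import:

```sql
-- Volumes whose parent novel was not backfilled
SELECT v."Id"
FROM "Volumes" v
LEFT JOIN "Novels" n ON n."Id" = v."NovelId"
WHERE n."Id" IS NULL;

-- Chapters whose parent volume was not backfilled
SELECT c."Id"
FROM "Chapters" c
LEFT JOIN "Volumes" v ON v."Id" = c."VolumeId"
WHERE v."Id" IS NULL;
```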

View File

@@ -0,0 +1,28 @@
-- Extract Users from UserService database
-- Run this against: UserService PostgreSQL database
-- Output: CSV or use COPY TO for bulk export

-- Option 1: Simple SELECT for review/testing
SELECT
    "Id",
    "OAuthProviderId",
    "CreatedTime",
    "LastUpdatedTime"
FROM "Users"
WHERE "Disabled" = false
ORDER BY "CreatedTime";

-- Option 2: Generate INSERT statements (useful for small datasets)
SELECT format(
    'INSERT INTO "Users" ("Id", "OAuthProviderId", "CreatedTime", "LastUpdatedTime") VALUES (%L, %L, %L, %L) ON CONFLICT ("Id") DO NOTHING;',
    "Id",
    "OAuthProviderId",
    "CreatedTime",
    "LastUpdatedTime"
)
FROM "Users"
WHERE "Disabled" = false
ORDER BY "CreatedTime";

-- Option 3: Export to CSV (run from psql)
-- \copy (SELECT "Id", "OAuthProviderId", "CreatedTime", "LastUpdatedTime" FROM "Users" WHERE "Disabled" = false ORDER BY "CreatedTime") TO '/tmp/users_export.csv' WITH CSV HEADER;

View File

@@ -0,0 +1,24 @@
-- Extract Novels from NovelService database
-- Run this against: NovelService PostgreSQL database
-- Output: CSV or use COPY TO for bulk export

-- Option 1: Simple SELECT for review/testing
SELECT
    "Id",
    "CreatedTime",
    "LastUpdatedTime"
FROM "Novels"
ORDER BY "Id";

-- Option 2: Generate INSERT statements
SELECT format(
    'INSERT INTO "Novels" ("Id", "CreatedTime", "LastUpdatedTime") VALUES (%s, %L, %L) ON CONFLICT ("Id") DO NOTHING;',
    "Id",
    "CreatedTime",
    "LastUpdatedTime"
)
FROM "Novels"
ORDER BY "Id";

-- Option 3: Export to CSV (run from psql)
-- \copy (SELECT "Id", "CreatedTime", "LastUpdatedTime" FROM "Novels" ORDER BY "Id") TO '/tmp/novels_export.csv' WITH CSV HEADER;

View File

@@ -0,0 +1,26 @@
-- Extract Volumes from NovelService database
-- Run this against: NovelService PostgreSQL database
-- Output: CSV or use COPY TO for bulk export

-- Option 1: Simple SELECT for review/testing
SELECT
    "Id",
    "NovelId",
    "CreatedTime",
    "LastUpdatedTime"
FROM "Volume"
ORDER BY "NovelId", "Id";

-- Option 2: Generate INSERT statements
SELECT format(
    'INSERT INTO "Volumes" ("Id", "NovelId", "CreatedTime", "LastUpdatedTime") VALUES (%s, %s, %L, %L) ON CONFLICT ("Id") DO NOTHING;',
    "Id",
    "NovelId",
    "CreatedTime",
    "LastUpdatedTime"
)
FROM "Volume"
ORDER BY "NovelId", "Id";

-- Option 3: Export to CSV (run from psql)
-- \copy (SELECT "Id", "NovelId", "CreatedTime", "LastUpdatedTime" FROM "Volume" ORDER BY "NovelId", "Id") TO '/tmp/volumes_export.csv' WITH CSV HEADER;

View File

@@ -0,0 +1,26 @@
-- Extract Chapters from NovelService database
-- Run this against: NovelService PostgreSQL database
-- Output: CSV or use COPY TO for bulk export

-- Option 1: Simple SELECT for review/testing
SELECT
    "Id",
    "VolumeId",
    "CreatedTime",
    "LastUpdatedTime"
FROM "Chapter"
ORDER BY "VolumeId", "Id";

-- Option 2: Generate INSERT statements
SELECT format(
    'INSERT INTO "Chapters" ("Id", "VolumeId", "CreatedTime", "LastUpdatedTime") VALUES (%s, %s, %L, %L) ON CONFLICT ("Id") DO NOTHING;',
    "Id",
    "VolumeId",
    "CreatedTime",
    "LastUpdatedTime"
)
FROM "Chapter"
ORDER BY "VolumeId", "Id";

-- Option 3: Export to CSV (run from psql)
-- \copy (SELECT "Id", "VolumeId", "CreatedTime", "LastUpdatedTime" FROM "Chapter" ORDER BY "VolumeId", "Id") TO '/tmp/chapters_export.csv' WITH CSV HEADER;

View File

@@ -0,0 +1,32 @@
-- Insert Users into UserNovelDataService database
-- Run this against: UserNovelDataService PostgreSQL database
--
-- PREREQUISITE: You must have extracted users from UserService first
-- using 01_extract_users_from_userservice.sql

-- Option 1: If you have a CSV file from the export
-- \copy "Users" ("Id", "OAuthProviderId", "CreatedTime", "LastUpdatedTime") FROM '/tmp/users_export.csv' WITH CSV HEADER;

-- Option 2: Direct cross-database insert using dblink
-- First, install the dblink extension if not already done:
-- CREATE EXTENSION IF NOT EXISTS dblink;
-- Example using dblink (adjust the connection string):
/*
INSERT INTO "Users" ("Id", "OAuthProviderId", "CreatedTime", "LastUpdatedTime")
SELECT
    "Id"::uuid,
    "OAuthProviderId",
    "CreatedTime"::timestamp with time zone,
    "LastUpdatedTime"::timestamp with time zone
FROM dblink(
    'host=localhost port=5432 dbname=userservice user=postgres password=yourpassword',
    'SELECT "Id", "OAuthProviderId", "CreatedTime", "LastUpdatedTime" FROM "Users" WHERE "Disabled" = false'
) AS t("Id" uuid, "OAuthProviderId" text, "CreatedTime" timestamp with time zone, "LastUpdatedTime" timestamp with time zone)
ON CONFLICT ("Id") DO UPDATE SET
    "OAuthProviderId" = EXCLUDED."OAuthProviderId",
    "LastUpdatedTime" = EXCLUDED."LastUpdatedTime";
*/

-- Option 3: Paste generated INSERT statements from the extraction script here
-- INSERT INTO "Users" ("Id", "OAuthProviderId", "CreatedTime", "LastUpdatedTime") VALUES (...) ON CONFLICT ("Id") DO NOTHING;

View File

@@ -0,0 +1,31 @@
-- Insert Novels into UserNovelDataService database
-- Run this against: UserNovelDataService PostgreSQL database
--
-- PREREQUISITE:
-- 1. Ensure the Novels table exists (run EF migrations first if needed)
-- 2. Extract novels from NovelService using 02_extract_novels_from_novelservice.sql

-- Option 1: If you have a CSV file from the export
-- \copy "Novels" ("Id", "CreatedTime", "LastUpdatedTime") FROM '/tmp/novels_export.csv' WITH CSV HEADER;

-- Option 2: Direct cross-database insert using dblink
-- First, install the dblink extension if not already done:
-- CREATE EXTENSION IF NOT EXISTS dblink;
-- Example using dblink (adjust the connection string):
/*
INSERT INTO "Novels" ("Id", "CreatedTime", "LastUpdatedTime")
SELECT
    "Id"::bigint,
    "CreatedTime"::timestamp with time zone,
    "LastUpdatedTime"::timestamp with time zone
FROM dblink(
    'host=localhost port=5432 dbname=novelservice user=postgres password=yourpassword',
    'SELECT "Id", "CreatedTime", "LastUpdatedTime" FROM "Novels"'
) AS t("Id" bigint, "CreatedTime" timestamp with time zone, "LastUpdatedTime" timestamp with time zone)
ON CONFLICT ("Id") DO UPDATE SET
    "LastUpdatedTime" = EXCLUDED."LastUpdatedTime";
*/

-- Option 3: Paste generated INSERT statements from the extraction script here
-- INSERT INTO "Novels" ("Id", "CreatedTime", "LastUpdatedTime") VALUES (...) ON CONFLICT ("Id") DO NOTHING;

View File

@@ -0,0 +1,34 @@
-- Insert Volumes into UserNovelDataService database
-- Run this against: UserNovelDataService PostgreSQL database
--
-- PREREQUISITE:
-- 1. Ensure the Volumes table exists (run EF migrations first if needed)
-- 2. Novels must be inserted first (FK constraint)
-- 3. Extract volumes from NovelService using 03_extract_volumes_from_novelservice.sql

-- Option 1: If you have a CSV file from the export
-- \copy "Volumes" ("Id", "NovelId", "CreatedTime", "LastUpdatedTime") FROM '/tmp/volumes_export.csv' WITH CSV HEADER;

-- Option 2: Direct cross-database insert using dblink
-- First, install the dblink extension if not already done:
-- CREATE EXTENSION IF NOT EXISTS dblink;
-- Example using dblink (adjust the connection string):
/*
INSERT INTO "Volumes" ("Id", "NovelId", "CreatedTime", "LastUpdatedTime")
SELECT
    "Id"::bigint,
    "NovelId"::bigint,
    "CreatedTime"::timestamp with time zone,
    "LastUpdatedTime"::timestamp with time zone
FROM dblink(
    'host=localhost port=5432 dbname=novelservice user=postgres password=yourpassword',
    'SELECT "Id", "NovelId", "CreatedTime", "LastUpdatedTime" FROM "Volume"'
) AS t("Id" bigint, "NovelId" bigint, "CreatedTime" timestamp with time zone, "LastUpdatedTime" timestamp with time zone)
ON CONFLICT ("Id") DO UPDATE SET
    "NovelId" = EXCLUDED."NovelId",
    "LastUpdatedTime" = EXCLUDED."LastUpdatedTime";
*/

-- Option 3: Paste generated INSERT statements from the extraction script here
-- INSERT INTO "Volumes" ("Id", "NovelId", "CreatedTime", "LastUpdatedTime") VALUES (...) ON CONFLICT ("Id") DO NOTHING;

View File

@@ -0,0 +1,34 @@
-- Insert Chapters into UserNovelDataService database
-- Run this against: UserNovelDataService PostgreSQL database
--
-- PREREQUISITE:
-- 1. Ensure the Chapters table exists (run EF migrations first if needed)
-- 2. Volumes must be inserted first (FK constraint)
-- 3. Extract chapters from NovelService using 04_extract_chapters_from_novelservice.sql

-- Option 1: If you have a CSV file from the export
-- \copy "Chapters" ("Id", "VolumeId", "CreatedTime", "LastUpdatedTime") FROM '/tmp/chapters_export.csv' WITH CSV HEADER;

-- Option 2: Direct cross-database insert using dblink
-- First, install the dblink extension if not already done:
-- CREATE EXTENSION IF NOT EXISTS dblink;
-- Example using dblink (adjust the connection string):
/*
INSERT INTO "Chapters" ("Id", "VolumeId", "CreatedTime", "LastUpdatedTime")
SELECT
    "Id"::bigint,
    "VolumeId"::bigint,
    "CreatedTime"::timestamp with time zone,
    "LastUpdatedTime"::timestamp with time zone
FROM dblink(
    'host=localhost port=5432 dbname=novelservice user=postgres password=yourpassword',
    'SELECT "Id", "VolumeId", "CreatedTime", "LastUpdatedTime" FROM "Chapter"'
) AS t("Id" bigint, "VolumeId" bigint, "CreatedTime" timestamp with time zone, "LastUpdatedTime" timestamp with time zone)
ON CONFLICT ("Id") DO UPDATE SET
    "VolumeId" = EXCLUDED."VolumeId",
    "LastUpdatedTime" = EXCLUDED."LastUpdatedTime";
*/

-- Option 3: Paste generated INSERT statements from the extraction script here
-- INSERT INTO "Chapters" ("Id", "VolumeId", "CreatedTime", "LastUpdatedTime") VALUES (...) ON CONFLICT ("Id") DO NOTHING;

View File

@@ -0,0 +1,53 @@
using FictionArchive.Service.Shared.Services.EventBus;
using FictionArchive.Service.UserNovelDataService.Models.Database;
using FictionArchive.Service.UserNovelDataService.Models.IntegrationEvents;
using Microsoft.EntityFrameworkCore;

namespace FictionArchive.Service.UserNovelDataService.Services.EventHandlers;

public class ChapterCreatedEventHandler : IIntegrationEventHandler<ChapterCreatedEvent>
{
    private readonly UserNovelDataServiceDbContext _dbContext;
    private readonly ILogger<ChapterCreatedEventHandler> _logger;

    public ChapterCreatedEventHandler(
        UserNovelDataServiceDbContext dbContext,
        ILogger<ChapterCreatedEventHandler> logger)
    {
        _dbContext = dbContext;
        _logger = logger;
    }

    public async Task Handle(ChapterCreatedEvent @event)
    {
        // Ensure novel exists
        var novelExists = await _dbContext.Novels.AnyAsync(n => n.Id == @event.NovelId);
        if (!novelExists)
        {
            var novel = new Novel { Id = @event.NovelId };
            _dbContext.Novels.Add(novel);
        }

        // Ensure volume exists; the NovelId FK must be set or the insert violates the constraint
        var volumeExists = await _dbContext.Volumes.AnyAsync(v => v.Id == @event.VolumeId);
        if (!volumeExists)
        {
            var volume = new Volume { Id = @event.VolumeId, NovelId = @event.NovelId };
            _dbContext.Volumes.Add(volume);
        }

        // Create the chapter if it does not already exist
        var chapterExists = await _dbContext.Chapters.AnyAsync(c => c.Id == @event.ChapterId);
        if (chapterExists)
        {
            _logger.LogDebug("Chapter {ChapterId} already exists, skipping", @event.ChapterId);
            return;
        }

        var chapter = new Chapter { Id = @event.ChapterId, VolumeId = @event.VolumeId };
        _dbContext.Chapters.Add(chapter);
        await _dbContext.SaveChangesAsync();

        _logger.LogInformation("Created chapter stub for {ChapterId} in novel {NovelId}", @event.ChapterId, @event.NovelId);
    }
}

View File

@@ -0,0 +1,36 @@
using FictionArchive.Service.Shared.Services.EventBus;
using FictionArchive.Service.UserNovelDataService.Models.Database;
using FictionArchive.Service.UserNovelDataService.Models.IntegrationEvents;
using Microsoft.EntityFrameworkCore;

namespace FictionArchive.Service.UserNovelDataService.Services.EventHandlers;

public class NovelCreatedEventHandler : IIntegrationEventHandler<NovelCreatedEvent>
{
    private readonly UserNovelDataServiceDbContext _dbContext;
    private readonly ILogger<NovelCreatedEventHandler> _logger;

    public NovelCreatedEventHandler(
        UserNovelDataServiceDbContext dbContext,
        ILogger<NovelCreatedEventHandler> logger)
    {
        _dbContext = dbContext;
        _logger = logger;
    }

    public async Task Handle(NovelCreatedEvent @event)
    {
        var exists = await _dbContext.Novels.AnyAsync(n => n.Id == @event.NovelId);
        if (exists)
        {
            _logger.LogDebug("Novel {NovelId} already exists, skipping", @event.NovelId);
            return;
        }

        var novel = new Novel { Id = @event.NovelId };
        _dbContext.Novels.Add(novel);
        await _dbContext.SaveChangesAsync();

        _logger.LogInformation("Created novel stub for {NovelId}", @event.NovelId);
    }
}

View File

@@ -0,0 +1,40 @@
using FictionArchive.Service.Shared.Services.EventBus;
using FictionArchive.Service.UserNovelDataService.Models.Database;
using FictionArchive.Service.UserNovelDataService.Models.IntegrationEvents;
using Microsoft.EntityFrameworkCore;

namespace FictionArchive.Service.UserNovelDataService.Services.EventHandlers;

public class UserInvitedEventHandler : IIntegrationEventHandler<UserInvitedEvent>
{
    private readonly UserNovelDataServiceDbContext _dbContext;
    private readonly ILogger<UserInvitedEventHandler> _logger;

    public UserInvitedEventHandler(
        UserNovelDataServiceDbContext dbContext,
        ILogger<UserInvitedEventHandler> logger)
    {
        _dbContext = dbContext;
        _logger = logger;
    }

    public async Task Handle(UserInvitedEvent @event)
    {
        var exists = await _dbContext.Users.AnyAsync(u => u.Id == @event.InvitedUserId);
        if (exists)
        {
            _logger.LogDebug("User {UserId} already exists, skipping", @event.InvitedUserId);
            return;
        }

        var user = new User
        {
            Id = @event.InvitedUserId,
            OAuthProviderId = @event.InvitedOAuthProviderId
        };
        _dbContext.Users.Add(user);
        await _dbContext.SaveChangesAsync();

        _logger.LogInformation("Created user stub for {UserId}", @event.InvitedUserId);
    }
}

View File

@@ -0,0 +1,38 @@
using FictionArchive.Service.Shared.Services.Database;
using FictionArchive.Service.UserNovelDataService.Models.Database;
using Microsoft.EntityFrameworkCore;

namespace FictionArchive.Service.UserNovelDataService.Services;

public class UserNovelDataServiceDbContext : FictionArchiveDbContext
{
    public DbSet<User> Users { get; set; }
    public DbSet<Bookmark> Bookmarks { get; set; }
    public DbSet<Novel> Novels { get; set; }
    public DbSet<Volume> Volumes { get; set; }
    public DbSet<Chapter> Chapters { get; set; }

    public UserNovelDataServiceDbContext(DbContextOptions options, ILogger<UserNovelDataServiceDbContext> logger)
        : base(options, logger)
    {
    }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        base.OnModelCreating(modelBuilder);

        modelBuilder.Entity<Bookmark>(entity =>
        {
            // Unique constraint: one bookmark per chapter per user
            entity.HasIndex(b => new { b.UserId, b.ChapterId }).IsUnique();

            // Index for efficient "get bookmarks for novel" queries
            entity.HasIndex(b => new { b.UserId, b.NovelId });

            // User relationship
            entity.HasOne(b => b.User)
                .WithMany()
                .HasForeignKey(b => b.UserId)
                .OnDelete(DeleteBehavior.Cascade);
        });
    }
}

View File

@@ -0,0 +1,8 @@
{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  }
}

View File

@@ -0,0 +1,26 @@
{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  },
  "ConnectionStrings": {
    "DefaultConnection": "Host=localhost;Database=FictionArchive_UserNovelDataService;Username=postgres;Password=postgres"
  },
  "RabbitMQ": {
    "ConnectionString": "amqp://localhost",
    "ClientIdentifier": "UserNovelDataService"
  },
  "OIDC": {
    "Authority": "https://auth.orfl.xyz/application/o/fiction-archive/",
    "ClientId": "ldi5IpEidq2WW0Ka1lehVskb2SOBjnYRaZCpEyBh",
    "Audience": "ldi5IpEidq2WW0Ka1lehVskb2SOBjnYRaZCpEyBh",
    "ValidIssuer": "https://auth.orfl.xyz/application/o/fiction-archive/",
    "ValidateIssuer": true,
    "ValidateAudience": true,
    "ValidateLifetime": true,
    "ValidateIssuerSigningKey": true
  },
  "AllowedHosts": "*"
}

View File

@@ -0,0 +1,6 @@
{
  "subgraph": "UserNovelData",
  "http": {
    "baseAddress": "https://localhost:7298/graphql"
  }
}

View File

@@ -0,0 +1,30 @@
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>net8.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
    <IsPackable>false</IsPackable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="FluentAssertions" Version="6.12.0" />
    <PackageReference Include="Microsoft.EntityFrameworkCore.InMemory" Version="9.0.11" />
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.11.1" />
    <PackageReference Include="NSubstitute" Version="5.1.0" />
    <PackageReference Include="xunit" Version="2.9.2" />
    <PackageReference Include="xunit.runner.visualstudio" Version="2.8.2">
      <PrivateAssets>all</PrivateAssets>
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
    </PackageReference>
    <PackageReference Include="coverlet.collector" Version="6.0.2">
      <PrivateAssets>all</PrivateAssets>
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
    </PackageReference>
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\FictionArchive.Service.UserService\FictionArchive.Service.UserService.csproj" />
  </ItemGroup>

</Project>

View File

@@ -0,0 +1,329 @@
using FictionArchive.Service.Shared.Services.EventBus;
using FictionArchive.Service.UserService.Models.Database;
using FictionArchive.Service.UserService.Services;
using FictionArchive.Service.UserService.Services.AuthenticationClient;
using FictionArchive.Service.UserService.Services.AuthenticationClient.Authentik;
using FluentAssertions;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging.Abstractions;
using NSubstitute;
using Xunit;

namespace FictionArchive.Service.UserService.Tests;

public class UserManagementServiceTests
{
    #region Helper Methods

    private static UserServiceDbContext CreateDbContext()
    {
        var options = new DbContextOptionsBuilder<UserServiceDbContext>()
            .UseInMemoryDatabase($"UserManagementServiceTests-{Guid.NewGuid()}")
            .Options;
        return new UserServiceDbContext(options, NullLogger<UserServiceDbContext>.Instance);
    }

    private static UserManagementService CreateService(
        UserServiceDbContext dbContext,
        IAuthenticationServiceClient authClient,
        IEventBus? eventBus = null)
    {
        return new UserManagementService(
            dbContext,
            NullLogger<UserManagementService>.Instance,
            authClient,
            eventBus ?? Substitute.For<IEventBus>());
    }

    private static User CreateTestUser(string username, string email, int availableInvites = 5)
    {
        return new User
        {
            Username = username,
            Email = email,
            OAuthProviderId = Guid.NewGuid().ToString(),
            Disabled = false,
            AvailableInvites = availableInvites
        };
    }

    #endregion
    #region InviteUserAsync Tests

    [Fact]
    public async Task InviteUserAsync_WithValidInviter_CreatesUserAndDecrementsInvites()
    {
        // Arrange
        using var dbContext = CreateDbContext();
        var inviter = CreateTestUser("inviter", "inviter@test.com", availableInvites: 3);
        dbContext.Users.Add(inviter);
        await dbContext.SaveChangesAsync();

        var authClient = Substitute.For<IAuthenticationServiceClient>();
        authClient.CreateUserAsync(Arg.Any<string>(), Arg.Any<string>(), Arg.Any<string>())
            .Returns(new AuthentikUserResponse { Pk = 123, Uid = "authentik-uid-456" });
        authClient.SendRecoveryEmailAsync(Arg.Any<int>()).Returns(true);
        var service = CreateService(dbContext, authClient);

        // Act
        var result = await service.InviteUserAsync(inviter, "new@test.com", "newuser");

        // Assert
        result.Should().NotBeNull();
        result!.Username.Should().Be("newuser");
        result.Email.Should().Be("new@test.com");
        result.InviterId.Should().Be(inviter.Id);
        result.AvailableInvites.Should().Be(0);
        inviter.AvailableInvites.Should().Be(2);
        await authClient.Received(1).CreateUserAsync("newuser", "new@test.com", "newuser");
        await authClient.Received(1).SendRecoveryEmailAsync(123);
    }

    [Fact]
    public async Task InviteUserAsync_WithNoAvailableInvites_ReturnsNull()
    {
        // Arrange
        using var dbContext = CreateDbContext();
        var inviter = CreateTestUser("inviter", "inviter@test.com", availableInvites: 0);
        dbContext.Users.Add(inviter);
        await dbContext.SaveChangesAsync();

        var authClient = Substitute.For<IAuthenticationServiceClient>();
        var service = CreateService(dbContext, authClient);

        // Act
        var result = await service.InviteUserAsync(inviter, "new@test.com", "newuser");

        // Assert
        result.Should().BeNull();
        await authClient.DidNotReceive().CreateUserAsync(Arg.Any<string>(), Arg.Any<string>(), Arg.Any<string>());
    }

    [Fact]
    public async Task InviteUserAsync_WithDuplicateEmail_ReturnsNull()
    {
        // Arrange
        using var dbContext = CreateDbContext();
        var existingUser = CreateTestUser("existing", "existing@test.com");
        var inviter = CreateTestUser("inviter", "inviter@test.com", availableInvites: 3);
        dbContext.Users.AddRange(existingUser, inviter);
        await dbContext.SaveChangesAsync();

        var authClient = Substitute.For<IAuthenticationServiceClient>();
        var service = CreateService(dbContext, authClient);

        // Act
        var result = await service.InviteUserAsync(inviter, "existing@test.com", "newuser");

        // Assert
        result.Should().BeNull();
        await authClient.DidNotReceive().CreateUserAsync(Arg.Any<string>(), Arg.Any<string>(), Arg.Any<string>());
        inviter.AvailableInvites.Should().Be(3); // Not decremented
    }

    [Fact]
    public async Task InviteUserAsync_WithDuplicateUsername_ReturnsNull()
    {
        // Arrange
        using var dbContext = CreateDbContext();
        var existingUser = CreateTestUser("existinguser", "existing@test.com");
        var inviter = CreateTestUser("inviter", "inviter@test.com", availableInvites: 3);
        dbContext.Users.AddRange(existingUser, inviter);
        await dbContext.SaveChangesAsync();

        var authClient = Substitute.For<IAuthenticationServiceClient>();
        var service = CreateService(dbContext, authClient);

        // Act
        var result = await service.InviteUserAsync(inviter, "new@test.com", "existinguser");

        // Assert
        result.Should().BeNull();
        await authClient.DidNotReceive().CreateUserAsync(Arg.Any<string>(), Arg.Any<string>(), Arg.Any<string>());
        inviter.AvailableInvites.Should().Be(3); // Not decremented
    }
    [Fact]
    public async Task InviteUserAsync_WhenAuthentikFails_ReturnsNull()
    {
        // Arrange
        using var dbContext = CreateDbContext();
        var inviter = CreateTestUser("inviter", "inviter@test.com", availableInvites: 3);
        dbContext.Users.Add(inviter);
        await dbContext.SaveChangesAsync();

        var authClient = Substitute.For<IAuthenticationServiceClient>();
        authClient.CreateUserAsync(Arg.Any<string>(), Arg.Any<string>(), Arg.Any<string>())
            .Returns((AuthentikUserResponse?)null);
        var service = CreateService(dbContext, authClient);

        // Act
        var result = await service.InviteUserAsync(inviter, "new@test.com", "newuser");

        // Assert
        result.Should().BeNull();
        await authClient.DidNotReceive().SendRecoveryEmailAsync(Arg.Any<int>());

        // Verify no user was added to the database
        var usersInDb = await dbContext.Users.ToListAsync();
        usersInDb.Should().HaveCount(1); // Only the inviter
        inviter.AvailableInvites.Should().Be(3); // Not decremented
    }

    [Fact]
    public async Task InviteUserAsync_WhenRecoveryEmailFails_StillCreatesUser()
    {
        // Arrange
        using var dbContext = CreateDbContext();
        var inviter = CreateTestUser("inviter", "inviter@test.com", availableInvites: 3);
        dbContext.Users.Add(inviter);
        await dbContext.SaveChangesAsync();

        var authClient = Substitute.For<IAuthenticationServiceClient>();
        authClient.CreateUserAsync(Arg.Any<string>(), Arg.Any<string>(), Arg.Any<string>())
            .Returns(new AuthentikUserResponse { Pk = 123, Uid = "authentik-uid-456" });
        authClient.SendRecoveryEmailAsync(Arg.Any<int>()).Returns(false); // Email fails
        var service = CreateService(dbContext, authClient);

        // Act
        var result = await service.InviteUserAsync(inviter, "new@test.com", "newuser");

        // Assert - User should still be created despite email failure
        result.Should().NotBeNull();
        result!.Username.Should().Be("newuser");
        inviter.AvailableInvites.Should().Be(2);

        // Verify user was added to database
        var usersInDb = await dbContext.Users.ToListAsync();
        usersInDb.Should().HaveCount(2);
    }

    [Fact]
    public async Task InviteUserAsync_SetsCorrectUserProperties()
    {
        // Arrange
        using var dbContext = CreateDbContext();
        var inviter = CreateTestUser("inviter", "inviter@test.com", availableInvites: 5);
        dbContext.Users.Add(inviter);
        await dbContext.SaveChangesAsync();

        var authentikPk = 456;
        var authClient = Substitute.For<IAuthenticationServiceClient>();
        authClient.CreateUserAsync(Arg.Any<string>(), Arg.Any<string>(), Arg.Any<string>())
            .Returns(new AuthentikUserResponse { Pk = authentikPk, Uid = "authentik-uid-789" });
        authClient.SendRecoveryEmailAsync(Arg.Any<int>()).Returns(true);
        var service = CreateService(dbContext, authClient);

        // Act
        var result = await service.InviteUserAsync(inviter, "newuser@test.com", "newusername");

        // Assert
        result.Should().NotBeNull();
        result!.Username.Should().Be("newusername");
        result.Email.Should().Be("newuser@test.com");
        result.OAuthProviderId.Should().Be(authentikPk.ToString());
        result.InviterId.Should().Be(inviter.Id);
        result.AvailableInvites.Should().Be(0);
        result.Disabled.Should().BeFalse();
        result.Id.Should().NotBeEmpty();
    }

    #endregion
    #region GetUserByOAuthProviderIdAsync Tests

    [Fact]
    public async Task GetUserByOAuthProviderIdAsync_WithExistingUser_ReturnsUser()
    {
        // Arrange
        using var dbContext = CreateDbContext();
        var oAuthProviderId = "oauth-provider-123";
        var user = new User
        {
            Username = "testuser",
            Email = "test@test.com",
            OAuthProviderId = oAuthProviderId,
            Disabled = false,
            AvailableInvites = 5
        };
        dbContext.Users.Add(user);
        await dbContext.SaveChangesAsync();

        var authClient = Substitute.For<IAuthenticationServiceClient>();
        var service = CreateService(dbContext, authClient);

        // Act
        var result = await service.GetUserByOAuthProviderIdAsync(oAuthProviderId);

        // Assert
        result.Should().NotBeNull();
        result!.Id.Should().Be(user.Id);
        result.Username.Should().Be("testuser");
        result.OAuthProviderId.Should().Be(oAuthProviderId);
    }

    [Fact]
    public async Task GetUserByOAuthProviderIdAsync_WithNonExistingUser_ReturnsNull()
    {
        // Arrange
        using var dbContext = CreateDbContext();
        var authClient = Substitute.For<IAuthenticationServiceClient>();
        var service = CreateService(dbContext, authClient);

        // Act
        var result = await service.GetUserByOAuthProviderIdAsync("non-existing-id");

        // Assert
        result.Should().BeNull();
    }

    #endregion
#region GetUsers Tests
[Fact]
public async Task GetUsers_ReturnsAllUsers()
{
// Arrange
using var dbContext = CreateDbContext();
var user1 = CreateTestUser("user1", "user1@test.com");
var user2 = CreateTestUser("user2", "user2@test.com");
var user3 = CreateTestUser("user3", "user3@test.com");
dbContext.Users.AddRange(user1, user2, user3);
await dbContext.SaveChangesAsync();
var authClient = Substitute.For<IAuthenticationServiceClient>();
var service = CreateService(dbContext, authClient);
// Act
var result = await service.GetUsers().ToListAsync();
// Assert
result.Should().HaveCount(3);
result.Select(u => u.Username).Should().BeEquivalentTo(new[] { "user1", "user2", "user3" });
}
[Fact]
public async Task GetUsers_WithEmptyDb_ReturnsEmptyQueryable()
{
// Arrange
using var dbContext = CreateDbContext();
var authClient = Substitute.For<IAuthenticationServiceClient>();
var service = CreateService(dbContext, authClient);
// Act
var result = await service.GetUsers().ToListAsync();
// Assert
result.Should().BeEmpty();
}
#endregion
}
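
The helpers the tests above rely on (CreateDbContext, CreateService, CreateTestUser) are not part of this hunk. A plausible shape, assuming EF Core's in-memory provider and a UserServiceDbContext constructor that accepts DbContextOptions (all names here are inferred, not confirmed by the diff):

```csharp
private static UserServiceDbContext CreateDbContext()
{
    // A fresh database name per call keeps tests isolated from each other.
    var options = new DbContextOptionsBuilder<UserServiceDbContext>()
        .UseInMemoryDatabase(Guid.NewGuid().ToString())
        .Options;
    return new UserServiceDbContext(options);
}

private static UserManagementService CreateService(
    UserServiceDbContext dbContext, IAuthenticationServiceClient authClient)
{
    // Event bus is stubbed; these tests only assert on database state.
    return new UserManagementService(
        dbContext,
        NullLogger<UserManagementService>.Instance,
        authClient,
        Substitute.For<IEventBus>());
}

private static User CreateTestUser(string username, string email) => new()
{
    Username = username,
    Email = email,
    OAuthProviderId = Guid.NewGuid().ToString()
};
```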

View File

@@ -22,6 +22,12 @@
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
</ItemGroup>
<ItemGroup>
<Folder Include="Models\IntegrationEvents\" />
<Folder Include="Services\EventHandlers\" />
</ItemGroup>
</Project>

View File

@@ -1,38 +1,53 @@
using FictionArchive.Service.Shared.Constants;
using System.Security.Claims;
using FictionArchive.Service.UserService.Models.DTOs;
using FictionArchive.Service.UserService.Services;
using HotChocolate.Authorization;
using HotChocolate.Types;
namespace FictionArchive.Service.UserService.GraphQL;
public class Mutation
{
[Authorize(Roles = [AuthorizationConstants.Roles.Admin])]
public async Task<UserDto> RegisterUser(string username, string email, string oAuthProviderId,
string? inviterOAuthProviderId, UserManagementService userManagementService)
[Authorize]
[Error<InvalidOperationException>]
public async Task<UserDto> InviteUser(
string email,
string username,
UserManagementService userManagementService,
ClaimsPrincipal claimsPrincipal)
{
var user = await userManagementService.RegisterUser(username, email, oAuthProviderId, inviterOAuthProviderId);
// Get the current user's OAuth provider ID from claims
var oAuthProviderId = claimsPrincipal.FindFirst(ClaimTypes.NameIdentifier)?.Value;
if (string.IsNullOrEmpty(oAuthProviderId))
{
throw new InvalidOperationException("Unable to determine current user identity");
}
// Get the inviter from the database
var inviter = await userManagementService.GetUserByOAuthProviderIdAsync(oAuthProviderId);
if (inviter == null)
{
throw new InvalidOperationException("Current user not found in the system");
}
// Invite the new user
var newUser = await userManagementService.InviteUserAsync(inviter, email, username);
if (newUser == null)
{
throw new InvalidOperationException(
"Failed to invite user. Either you have no available invites, or the email/username is already in use.");
}
return new UserDto
{
Id = user.Id,
CreatedTime = user.CreatedTime,
LastUpdatedTime = user.LastUpdatedTime,
Username = user.Username,
Email = user.Email,
Disabled = user.Disabled,
Inviter = user.Inviter != null
? new UserDto
{
Id = user.Inviter.Id,
CreatedTime = user.Inviter.CreatedTime,
LastUpdatedTime = user.Inviter.LastUpdatedTime,
Username = user.Inviter.Username,
Email = user.Inviter.Email,
Disabled = user.Inviter.Disabled,
Inviter = null // Limit recursion to one level
}
: null
Id = newUser.Id,
CreatedTime = newUser.CreatedTime,
LastUpdatedTime = newUser.LastUpdatedTime,
Username = newUser.Username,
Email = newUser.Email,
Disabled = newUser.Disabled,
AvailableInvites = newUser.AvailableInvites,
InviterId = newUser.InviterId
};
}
}
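
A sketch of how a client would call the new mutation, assuming HotChocolate's default camel-casing and that mutation conventions are enabled (the `[Error<InvalidOperationException>]` attribute then surfaces failures in an `errors` field rather than as a GraphQL error; the exact payload shape depends on server configuration):

```graphql
mutation {
  inviteUser(email: "newuser@test.com", username: "newusername") {
    userDto {
      id
      username
      email
      availableInvites
      inviterId
    }
    errors {
      ... on InvalidOperationError {
        message
      }
    }
  }
}
```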

View File

@@ -1,34 +1,43 @@
using System.Security.Claims;
using FictionArchive.Service.UserService.Models.DTOs;
using FictionArchive.Service.UserService.Services;
using HotChocolate.Authorization;
using HotChocolate.Data;
namespace FictionArchive.Service.UserService.GraphQL;
public class Query
{
[Authorize]
public IQueryable<UserDto> GetUsers(UserManagementService userManagementService)
[UseProjection]
[UseFirstOrDefault]
public IQueryable<UserDto> GetCurrentUser(
UserServiceDbContext dbContext,
ClaimsPrincipal claimsPrincipal)
{
return userManagementService.GetUsers().Select(user => new UserDto
var oAuthProviderId = claimsPrincipal.FindFirst(ClaimTypes.NameIdentifier)?.Value;
if (string.IsNullOrEmpty(oAuthProviderId))
{
Id = user.Id,
CreatedTime = user.CreatedTime,
LastUpdatedTime = user.LastUpdatedTime,
Username = user.Username,
Email = user.Email,
Disabled = user.Disabled,
Inviter = user.Inviter != null
? new UserDto
{
Id = user.Inviter.Id,
CreatedTime = user.Inviter.CreatedTime,
LastUpdatedTime = user.Inviter.LastUpdatedTime,
Username = user.Inviter.Username,
Email = user.Inviter.Email,
Disabled = user.Inviter.Disabled,
Inviter = null // Limit recursion to one level
return Enumerable.Empty<UserDto>().AsQueryable();
}
: null
return dbContext.Users
.Where(u => u.OAuthProviderId == oAuthProviderId)
.Select(u => new UserDto
{
Id = u.Id,
CreatedTime = u.CreatedTime,
LastUpdatedTime = u.LastUpdatedTime,
Username = u.Username,
Email = u.Email,
Disabled = u.Disabled,
AvailableInvites = u.AvailableInvites,
InviterId = u.InviterId,
InvitedUsers = u.InvitedUsers.Select(iu => new InvitedUserDto
{
Username = iu.Username,
Email = iu.Email
}).ToList()
});
}
}
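
The resolver above can be exercised with a query like the following (field name inferred from HotChocolate's convention of stripping the `Get` prefix). Because of `[UseProjection]`, only the selected columns are translated into SQL, and `[UseFirstOrDefault]` collapses the queryable to a single nullable object:

```graphql
query {
  currentUser {
    id
    username
    availableInvites
    invitedUsers {
      username
      email
    }
  }
}
```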

View File

@@ -0,0 +1,83 @@
// <auto-generated />
using System;
using FictionArchive.Service.UserService.Services;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Migrations;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
using NodaTime;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace FictionArchive.Service.UserService.Migrations
{
[DbContext(typeof(UserServiceDbContext))]
[Migration("20251229151921_AddAvailableInvites")]
partial class AddAvailableInvites
{
/// <inheritdoc />
protected override void BuildTargetModel(ModelBuilder modelBuilder)
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("ProductVersion", "9.0.11")
.HasAnnotation("Relational:MaxIdentifierLength", 63);
NpgsqlModelBuilderExtensions.UseIdentityByDefaultColumns(modelBuilder);
modelBuilder.Entity("FictionArchive.Service.UserService.Models.Database.User", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid");
b.Property<int>("AvailableInvites")
.HasColumnType("integer");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<bool>("Disabled")
.HasColumnType("boolean");
b.Property<string>("Email")
.IsRequired()
.HasColumnType("text");
b.Property<Guid?>("InviterId")
.HasColumnType("uuid");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<string>("OAuthProviderId")
.IsRequired()
.HasColumnType("text");
b.Property<string>("Username")
.IsRequired()
.HasColumnType("text");
b.HasKey("Id");
b.HasIndex("InviterId");
b.HasIndex("OAuthProviderId")
.IsUnique();
b.ToTable("Users");
});
modelBuilder.Entity("FictionArchive.Service.UserService.Models.Database.User", b =>
{
b.HasOne("FictionArchive.Service.UserService.Models.Database.User", "Inviter")
.WithMany()
.HasForeignKey("InviterId");
b.Navigation("Inviter");
});
#pragma warning restore 612, 618
}
}
}

View File

@@ -0,0 +1,29 @@
using Microsoft.EntityFrameworkCore.Migrations;
#nullable disable
namespace FictionArchive.Service.UserService.Migrations
{
/// <inheritdoc />
public partial class AddAvailableInvites : Migration
{
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.AddColumn<int>(
name: "AvailableInvites",
table: "Users",
type: "integer",
nullable: false,
defaultValue: 0);
}
/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropColumn(
name: "AvailableInvites",
table: "Users");
}
}
}
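
For reference, the DDL this migration produces through the Npgsql provider is roughly the following (exact quoting and ordering come from the provider):

```sql
-- Up
ALTER TABLE "Users" ADD "AvailableInvites" integer NOT NULL DEFAULT 0;

-- Down
ALTER TABLE "Users" DROP COLUMN "AvailableInvites";
```

The `defaultValue: 0` in the migration is what makes the column addition safe on a table with existing rows.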

View File

@@ -29,6 +29,9 @@ namespace FictionArchive.Service.UserService.Migrations
.ValueGeneratedOnAdd()
.HasColumnType("uuid");
b.Property<int>("AvailableInvites")
.HasColumnType("integer");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");

View File

@@ -0,0 +1,7 @@
namespace FictionArchive.Service.UserService.Models.DTOs;
public class InvitedUserDto
{
public required string Username { get; init; }
public required string Email { get; init; }
}

View File

@@ -7,9 +7,11 @@ public class UserDto
public Guid Id { get; init; }
public Instant CreatedTime { get; init; }
public Instant LastUpdatedTime { get; init; }
public required string Username { get; init; }
public required string Email { get; init; }
// OAuthProviderId intentionally omitted for security
public bool Disabled { get; init; }
public UserDto? Inviter { get; init; }
public int AvailableInvites { get; init; }
public Guid? InviterId { get; init; }
public List<InvitedUserDto>? InvitedUsers { get; init; }
}

View File

@@ -6,15 +6,14 @@ namespace FictionArchive.Service.UserService.Models.Database;
[Index(nameof(OAuthProviderId), IsUnique = true)]
public class User : BaseEntity<Guid>
{
public string Username { get; set; }
public string Email { get; set; }
public string OAuthProviderId { get; set; }
public required string Username { get; set; }
public required string Email { get; set; }
public required string OAuthProviderId { get; set; }
public bool Disabled { get; set; }
public int AvailableInvites { get; set; } = 0;
/// <summary>
/// The user that generated an invite used by this user.
/// </summary>
// Navigation properties
public Guid? InviterId { get; set; }
public User? Inviter { get; set; }
public ICollection<User> InvitedUsers { get; set; } = new List<User>();
}

View File

@@ -1,16 +0,0 @@
using FictionArchive.Service.Shared.Services.EventBus;
namespace FictionArchive.Service.UserService.Models.IntegrationEvents;
public class AuthUserAddedEvent : IIntegrationEvent
{
public string OAuthProviderId { get; set; }
public string InviterOAuthProviderId { get; set; }
// The email of the user that created the event
public string EventUserEmail { get; set; }
// The username of the user that created the event
public string EventUserUsername { get; set; }
}

View File

@@ -0,0 +1,17 @@
using FictionArchive.Service.Shared.Services.EventBus;
namespace FictionArchive.Service.UserService.Models.IntegrationEvents;
public class UserInvitedEvent : IIntegrationEvent
{
// Invited user info
public Guid InvitedUserId { get; set; }
public required string InvitedUsername { get; set; }
public required string InvitedEmail { get; set; }
public required string InvitedOAuthProviderId { get; set; }
// Inviter info
public Guid InviterId { get; set; }
public required string InviterUsername { get; set; }
public required string InviterOAuthProviderId { get; set; }
}

View File

@@ -1,11 +1,12 @@
using System.Net.Http.Headers;
using FictionArchive.Common.Extensions;
using FictionArchive.Service.Shared;
using FictionArchive.Service.Shared.Extensions;
using FictionArchive.Service.Shared.Services.EventBus.Implementations;
using FictionArchive.Service.UserService.GraphQL;
using FictionArchive.Service.UserService.Models.IntegrationEvents;
using FictionArchive.Service.UserService.Services;
using FictionArchive.Service.UserService.Services.EventHandlers;
using FictionArchive.Service.UserService.Services.AuthenticationClient;
using FictionArchive.Service.UserService.Services.AuthenticationClient.Authentik;
namespace FictionArchive.Service.UserService;
@@ -25,8 +26,7 @@ public class Program
builder.Services.AddRabbitMQ(opt =>
{
builder.Configuration.GetSection("RabbitMQ").Bind(opt);
})
.Subscribe<AuthUserAddedEvent, AuthUserAddedEventHandler>();
});
}
#endregion
@@ -38,6 +38,22 @@ public class Program
#endregion
#region Authentik Client
builder.Services.Configure<AuthentikConfiguration>(
builder.Configuration.GetSection("Authentik"));
var authentikConfig = builder.Configuration.GetSection("Authentik").Get<AuthentikConfiguration>();
builder.Services.AddHttpClient<IAuthenticationServiceClient, AuthentikClient>(client =>
{
client.BaseAddress = new Uri(authentikConfig?.BaseUrl ?? "https://localhost");
client.DefaultRequestHeaders.Authorization =
new AuthenticationHeaderValue("Bearer", authentikConfig?.ApiToken ?? "");
})
.AddStandardResilienceHandler();
#endregion
builder.Services.RegisterDbContext<UserServiceDbContext>(
builder.Configuration.GetConnectionString("DefaultConnection"),
skipInfrastructure: isSchemaExport);

View File

@@ -0,0 +1,21 @@
using Newtonsoft.Json;
namespace FictionArchive.Service.UserService.Services.AuthenticationClient.Authentik;
public class AuthentikAddUserRequest
{
[JsonProperty("username")]
public required string Username { get; set; }
[JsonProperty("name")]
public required string DisplayName { get; set; }
[JsonProperty("email")]
public required string Email { get; set; }
[JsonProperty("is_active")]
public bool IsActive { get; set; } = true;
[JsonProperty("type")]
public string Type { get; } = "external";
}

View File

@@ -0,0 +1,88 @@
using System.Text;
using Microsoft.Extensions.Options;
using Newtonsoft.Json;
namespace FictionArchive.Service.UserService.Services.AuthenticationClient.Authentik;
public class AuthentikClient : IAuthenticationServiceClient
{
private readonly HttpClient _httpClient;
private readonly ILogger<AuthentikClient> _logger;
private readonly AuthentikConfiguration _configuration;
public AuthentikClient(
HttpClient httpClient,
ILogger<AuthentikClient> logger,
IOptions<AuthentikConfiguration> configuration)
{
_httpClient = httpClient;
_logger = logger;
_configuration = configuration.Value;
}
public async Task<AuthentikUserResponse?> CreateUserAsync(string username, string email, string displayName)
{
var request = new AuthentikAddUserRequest
{
Username = username,
Email = email,
DisplayName = displayName,
IsActive = true
};
try
{
var json = JsonConvert.SerializeObject(request);
var content = new StringContent(json, Encoding.UTF8, "application/json");
var response = await _httpClient.PostAsync("/api/v3/core/users/", content);
if (!response.IsSuccessStatusCode)
{
var errorContent = await response.Content.ReadAsStringAsync();
_logger.LogError(
"Failed to create user in Authentik. Status: {StatusCode}, Error: {Error}",
response.StatusCode, errorContent);
return null;
}
var responseJson = await response.Content.ReadAsStringAsync();
var userResponse = JsonConvert.DeserializeObject<AuthentikUserResponse>(responseJson);
_logger.LogInformation("Successfully created user {Username} in Authentik with pk {Pk}",
username, userResponse?.Pk);
return userResponse;
}
catch (Exception ex)
{
_logger.LogError(ex, "Exception while creating user {Username} in Authentik", username);
return null;
}
}
public async Task<bool> SendRecoveryEmailAsync(int authentikUserId)
{
try
{
var response = await _httpClient.PostAsync(
$"/api/v3/core/users/{authentikUserId}/recovery_email/?email_stage={_configuration.EmailStageId}",
null);
if (!response.IsSuccessStatusCode)
{
var errorContent = await response.Content.ReadAsStringAsync();
_logger.LogError(
"Failed to send recovery email for user {UserId}. Status: {StatusCode}, Error: {Error}",
authentikUserId, response.StatusCode, errorContent);
return false;
}
_logger.LogInformation("Successfully sent recovery email to Authentik user {UserId}", authentikUserId);
return true;
}
catch (Exception ex)
{
_logger.LogError(ex, "Exception while sending recovery email to Authentik user {UserId}", authentikUserId);
return false;
}
}
}
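
The two Authentik REST calls the client wraps look roughly like this with curl. The endpoint paths mirror the code above; the host, token, stage ID, and user pk are placeholders, and this is illustrative only, not a verified transcript of the Authentik API:

```shell
# Create a user; the JSON body mirrors AuthentikAddUserRequest.
curl -X POST "https://auth.example.com/api/v3/core/users/" \
  -H "Authorization: Bearer $AUTHENTIK_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"username":"newuser","name":"newuser","email":"newuser@test.com","is_active":true,"type":"external"}'

# Trigger the recovery email for user pk 42; email_stage comes from configuration.
curl -X POST "https://auth.example.com/api/v3/core/users/42/recovery_email/?email_stage=$EMAIL_STAGE_ID" \
  -H "Authorization: Bearer $AUTHENTIK_API_TOKEN"
```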

View File

@@ -0,0 +1,8 @@
namespace FictionArchive.Service.UserService.Services.AuthenticationClient.Authentik;
public class AuthentikConfiguration
{
public string BaseUrl { get; set; } = string.Empty;
public string ApiToken { get; set; } = string.Empty;
public string EmailStageId { get; set; } = string.Empty;
}
(Note: `EmailStageId` has no initializer, unlike `BaseUrl` and `ApiToken`; initializing it to `string.Empty` as well would avoid a nullable-reference warning when the configuration section is missing.)

View File

@@ -0,0 +1,27 @@
using Newtonsoft.Json;
namespace FictionArchive.Service.UserService.Services.AuthenticationClient.Authentik;
public class AuthentikUserResponse
{
[JsonProperty("pk")]
public int Pk { get; set; }
[JsonProperty("username")]
public string Username { get; set; } = string.Empty;
[JsonProperty("name")]
public string Name { get; set; } = string.Empty;
[JsonProperty("email")]
public string Email { get; set; } = string.Empty;
[JsonProperty("is_active")]
public bool IsActive { get; set; }
[JsonProperty("is_superuser")]
public bool IsSuperuser { get; set; }
[JsonProperty("uid")]
public string Uid { get; set; } = string.Empty;
}

View File

@@ -0,0 +1,22 @@
using FictionArchive.Service.UserService.Services.AuthenticationClient.Authentik;
namespace FictionArchive.Service.UserService.Services.AuthenticationClient;
public interface IAuthenticationServiceClient
{
/// <summary>
/// Creates a new user in the authentication provider.
/// </summary>
/// <param name="username">The username for the new user</param>
/// <param name="email">The email address for the new user</param>
/// <param name="displayName">The display name for the new user</param>
/// <returns>The created user response, or null if creation failed</returns>
Task<AuthentikUserResponse?> CreateUserAsync(string username, string email, string displayName);
/// <summary>
/// Sends a password recovery email to the user.
/// </summary>
/// <param name="authentikUserId">The Authentik user ID (pk)</param>
/// <returns>True if the email was sent successfully, false otherwise</returns>
Task<bool> SendRecoveryEmailAsync(int authentikUserId);
}

View File

@@ -1,23 +0,0 @@
using FictionArchive.Service.Shared.Services.EventBus;
using FictionArchive.Service.UserService.Models.IntegrationEvents;
using FictionArchive.Service.UserService.Models.Database;
using Microsoft.EntityFrameworkCore;
namespace FictionArchive.Service.UserService.Services.EventHandlers;
public class AuthUserAddedEventHandler : IIntegrationEventHandler<AuthUserAddedEvent>
{
private readonly UserManagementService _userManagementService;
private readonly ILogger<AuthUserAddedEventHandler> _logger;
public AuthUserAddedEventHandler(UserServiceDbContext dbContext, ILogger<AuthUserAddedEventHandler> logger, UserManagementService userManagementService)
{
_logger = logger;
_userManagementService = userManagementService;
}
public async Task Handle(AuthUserAddedEvent @event)
{
await _userManagementService.RegisterUser(@event.EventUserUsername, @event.EventUserEmail, @event.OAuthProviderId, @event.InviterOAuthProviderId);
}
}

View File

@@ -1,4 +1,7 @@
using FictionArchive.Service.Shared.Services.EventBus;
using FictionArchive.Service.UserService.Models.Database;
using FictionArchive.Service.UserService.Models.IntegrationEvents;
using FictionArchive.Service.UserService.Services.AuthenticationClient;
using Microsoft.EntityFrameworkCore;
namespace FictionArchive.Service.UserService.Services;
@@ -7,39 +10,142 @@ public class UserManagementService
{
private readonly ILogger<UserManagementService> _logger;
private readonly UserServiceDbContext _dbContext;
private readonly IAuthenticationServiceClient _authClient;
private readonly IEventBus _eventBus;
public UserManagementService(UserServiceDbContext dbContext, ILogger<UserManagementService> logger)
public UserManagementService(
UserServiceDbContext dbContext,
ILogger<UserManagementService> logger,
IAuthenticationServiceClient authClient,
IEventBus eventBus)
{
_dbContext = dbContext;
_logger = logger;
_authClient = authClient;
_eventBus = eventBus;
}
public async Task<User> RegisterUser(string username, string email, string oAuthProviderId,
string? inviterOAuthProviderId)
/// <summary>
/// Invites a new user by creating them in Authentik, saving to the database, and sending a recovery email.
/// </summary>
/// <param name="inviter">The user sending the invite</param>
/// <param name="email">Email address of the invitee</param>
/// <param name="username">Username for the invitee</param>
/// <returns>The created user, or null if the invite failed</returns>
public async Task<User?> InviteUserAsync(User inviter, string email, string username)
{
var newUser = new User();
User? inviter =
await _dbContext.Users.FirstOrDefaultAsync(user => user.OAuthProviderId == inviterOAuthProviderId);
if (inviter == null && inviterOAuthProviderId != null)
// Check if inviter has available invites
if (inviter.AvailableInvites <= 0)
{
_logger.LogCritical(
"A user with OAuthProviderId {OAuthProviderId} was marked as having inviter with OAuthProviderId {inviterOAuthProviderId}, but no user was found with that value.",
inviterOAuthProviderId, inviterOAuthProviderId);
newUser.Disabled = true;
_logger.LogWarning("User {InviterId} has no available invites", inviter.Id);
return null;
}
newUser.Username = username;
newUser.Email = email;
newUser.OAuthProviderId = oAuthProviderId;
// Check if email is already in use
var existingUser = await _dbContext.Users
.AsQueryable()
.FirstOrDefaultAsync(u => u.Email == email);
_dbContext.Users.Add(newUser); // Add the new user to the DbContext
await _dbContext.SaveChangesAsync(); // Save changes to the database
if (existingUser != null)
{
_logger.LogWarning("Email {Email} is already in use", email);
return null;
}
// Check if username is already in use
var existingUsername = await _dbContext.Users
.AsQueryable()
.FirstOrDefaultAsync(u => u.Username == username);
if (existingUsername != null)
{
_logger.LogWarning("Username {Username} is already in use", username);
return null;
}
// Create user in Authentik
var authentikUser = await _authClient.CreateUserAsync(username, email, username);
if (authentikUser == null)
{
_logger.LogError("Failed to create user {Username} in Authentik", username);
return null;
}
// Send recovery email via Authentik
var emailSent = await _authClient.SendRecoveryEmailAsync(authentikUser.Pk);
if (!emailSent)
{
_logger.LogWarning(
"User {Username} was created in Authentik but recovery email failed to send. Authentik pk: {Pk}",
username, authentikUser.Pk);
// Continue anyway - the user is created, admin can resend email manually
}
// Create user in local database
var newUser = new User
{
Username = username,
Email = email,
OAuthProviderId = authentikUser.Pk.ToString(),
Disabled = false,
AvailableInvites = 0,
InviterId = inviter.Id
};
_dbContext.Users.Add(newUser);
// Decrement inviter's available invites
inviter.AvailableInvites--;
await _dbContext.SaveChangesAsync();
await _eventBus.Publish(new UserInvitedEvent
{
InvitedUserId = newUser.Id,
InvitedUsername = newUser.Username,
InvitedEmail = newUser.Email,
InvitedOAuthProviderId = newUser.OAuthProviderId,
InviterId = inviter.Id,
InviterUsername = inviter.Username,
InviterOAuthProviderId = inviter.OAuthProviderId
});
_logger.LogInformation(
"User {Username} was successfully invited by {InviterId}. New user id: {NewUserId}",
username, inviter.Id, newUser.Id);
return newUser;
}
/// <summary>
/// Gets a user by their OAuth provider ID (Authentik UID).
/// </summary>
public async Task<User?> GetUserByOAuthProviderIdAsync(string oAuthProviderId)
{
return await _dbContext.Users
.AsQueryable()
.FirstOrDefaultAsync(u => u.OAuthProviderId == oAuthProviderId);
}
/// <summary>
/// Gets all users as a queryable for GraphQL.
/// </summary>
public IQueryable<User> GetUsers()
{
return _dbContext.Users.AsQueryable();
}
/// <summary>
/// Gets all users invited by a specific user.
/// </summary>
/// <param name="inviterId">The ID of the user who sent the invites</param>
/// <returns>List of users invited by the specified user</returns>
public async Task<List<User>> GetInvitedByUserAsync(Guid inviterId)
{
return await _dbContext.Users
.AsQueryable()
.Where(u => u.InviterId == inviterId)
.OrderByDescending(u => u.CreatedTime)
.ToListAsync();
}
}
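
A minimal regression test for the early-return path above, assuming the same test fixture helpers (`CreateDbContext`, `CreateService`) used by the existing test class; those helper names are not part of this diff:

```csharp
[Fact]
public async Task InviteUserAsync_WithNoAvailableInvites_ReturnsNull()
{
    // Arrange: inviter has zero invites left
    using var dbContext = CreateDbContext();
    var inviter = new User
    {
        Username = "inviter",
        Email = "inviter@test.com",
        OAuthProviderId = "oauth-1",
        AvailableInvites = 0
    };
    dbContext.Users.Add(inviter);
    await dbContext.SaveChangesAsync();

    var authClient = Substitute.For<IAuthenticationServiceClient>();
    var service = CreateService(dbContext, authClient);

    // Act
    var result = await service.InviteUserAsync(inviter, "new@test.com", "newuser");

    // Assert: no user created, and Authentik was never called
    result.Should().BeNull();
    await authClient.DidNotReceive()
        .CreateUserAsync(Arg.Any<string>(), Arg.Any<string>(), Arg.Any<string>());
}
```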

View File

@@ -12,6 +12,11 @@
"ConnectionString": "amqp://localhost",
"ClientIdentifier": "UserService"
},
"Authentik": {
"BaseUrl": "https://auth.orfl.xyz",
"ApiToken": "REPLACE_ME",
"EmailStageId": "10df0c18-8802-4ec7-852e-3cdd355514d3"
},
"AllowedHosts": "*",
"OIDC": {
"Authority": "https://auth.orfl.xyz/application/o/fiction-archive/",

View File

@@ -1,5 +1,6 @@

Microsoft Visual Studio Solution File, Format Version 12.00
#
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FictionArchive.Common", "FictionArchive.Common\FictionArchive.Common.csproj", "{ABF1BA10-9E76-45BE-9947-E20445A68147}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FictionArchive.API", "FictionArchive.API\FictionArchive.API.csproj", "{420CC1A1-9DBC-40EC-B9E3-D4B25D71B9A9}"
@@ -14,12 +15,14 @@ Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FictionArchive.Service.Sche
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FictionArchive.Service.UserService", "FictionArchive.Service.UserService\FictionArchive.Service.UserService.csproj", "{EE4D4795-2F79-4614-886D-AF8DA77120AC}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FictionArchive.Service.AuthenticationService", "FictionArchive.Service.AuthenticationService\FictionArchive.Service.AuthenticationService.csproj", "{70C4AE82-B01E-421D-B590-C0F47E63CD0C}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FictionArchive.Service.FileService", "FictionArchive.Service.FileService\FictionArchive.Service.FileService.csproj", "{EC64A336-F8A0-4BED-9CA3-1B05AD00631D}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FictionArchive.Service.NovelService.Tests", "FictionArchive.Service.NovelService.Tests\FictionArchive.Service.NovelService.Tests.csproj", "{166E645E-9DFB-44E8-8CC8-FA249A11679F}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FictionArchive.Service.UserService.Tests", "FictionArchive.Service.UserService.Tests\FictionArchive.Service.UserService.Tests.csproj", "{10C38C89-983D-4544-8911-F03099F66AB8}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FictionArchive.Service.UserNovelDataService", "FictionArchive.Service.UserNovelDataService\FictionArchive.Service.UserNovelDataService.csproj", "{A278565B-D440-4AB9-B2E2-41BA3B3AD82A}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|Any CPU = Debug|Any CPU
@@ -54,10 +57,6 @@ Global
{EE4D4795-2F79-4614-886D-AF8DA77120AC}.Debug|Any CPU.Build.0 = Debug|Any CPU
{EE4D4795-2F79-4614-886D-AF8DA77120AC}.Release|Any CPU.ActiveCfg = Release|Any CPU
{EE4D4795-2F79-4614-886D-AF8DA77120AC}.Release|Any CPU.Build.0 = Release|Any CPU
{70C4AE82-B01E-421D-B590-C0F47E63CD0C}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{70C4AE82-B01E-421D-B590-C0F47E63CD0C}.Debug|Any CPU.Build.0 = Debug|Any CPU
{70C4AE82-B01E-421D-B590-C0F47E63CD0C}.Release|Any CPU.ActiveCfg = Release|Any CPU
{70C4AE82-B01E-421D-B590-C0F47E63CD0C}.Release|Any CPU.Build.0 = Release|Any CPU
{EC64A336-F8A0-4BED-9CA3-1B05AD00631D}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{EC64A336-F8A0-4BED-9CA3-1B05AD00631D}.Debug|Any CPU.Build.0 = Debug|Any CPU
{EC64A336-F8A0-4BED-9CA3-1B05AD00631D}.Release|Any CPU.ActiveCfg = Release|Any CPU
@@ -66,5 +65,13 @@ Global
{166E645E-9DFB-44E8-8CC8-FA249A11679F}.Debug|Any CPU.Build.0 = Debug|Any CPU
{166E645E-9DFB-44E8-8CC8-FA249A11679F}.Release|Any CPU.ActiveCfg = Release|Any CPU
{166E645E-9DFB-44E8-8CC8-FA249A11679F}.Release|Any CPU.Build.0 = Release|Any CPU
{10C38C89-983D-4544-8911-F03099F66AB8}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{10C38C89-983D-4544-8911-F03099F66AB8}.Debug|Any CPU.Build.0 = Debug|Any CPU
{10C38C89-983D-4544-8911-F03099F66AB8}.Release|Any CPU.ActiveCfg = Release|Any CPU
{10C38C89-983D-4544-8911-F03099F66AB8}.Release|Any CPU.Build.0 = Release|Any CPU
{A278565B-D440-4AB9-B2E2-41BA3B3AD82A}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{A278565B-D440-4AB9-B2E2-41BA3B3AD82A}.Debug|Any CPU.Build.0 = Debug|Any CPU
{A278565B-D440-4AB9-B2E2-41BA3B3AD82A}.Release|Any CPU.ActiveCfg = Release|Any CPU
{A278565B-D440-4AB9-B2E2-41BA3B3AD82A}.Release|Any CPU.Build.0 = Release|Any CPU
EndGlobalSection
EndGlobal

View File

@@ -4,25 +4,34 @@ services:
# ===========================================
postgres:
image: postgres:16-alpine
networks:
fictionarchive:
ipv4_address: 172.20.0.10
environment:
POSTGRES_USER: ${POSTGRES_USER:-postgres}
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-postgres}
volumes:
- postgres_data:/var/lib/postgresql/data
- /srv/docker_volumes/fictionarchive/postgres:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U postgres"]
interval: 5s
timeout: 5s
retries: 5
restart: unless-stopped
ports:
- 4321:5432
rabbitmq:
image: rabbitmq:3-management-alpine
networks:
fictionarchive:
ipv4_address: 172.20.0.11
environment:
RABBITMQ_DEFAULT_USER: ${RABBITMQ_USER:-guest}
RABBITMQ_DEFAULT_PASS: ${RABBITMQ_PASSWORD:-guest}
RABBITMQ_SERVER_ADDITIONAL_ERL_ARGS: -rabbit max_message_size 536870912
volumes:
- rabbitmq_data:/var/lib/rabbitmq
- /srv/docker_volumes/fictionarchive/rabbitmq:/var/lib/rabbitmq
healthcheck:
test: ["CMD", "rabbitmq-diagnostics", "check_running"]
interval: 10s
@@ -30,6 +39,37 @@ services:
retries: 5
restart: unless-stopped
# ===========================================
# VPN Container
# ===========================================
vpn:
image: dperson/openvpn-client
networks:
fictionarchive:
ipv4_address: 172.20.0.20
aliases:
- novel-service
cap_add:
- NET_ADMIN
devices:
- /dev/net/tun
volumes:
- /srv/docker_volumes/korean_vpn:/vpn
dns:
- 192.168.3.1
environment:
- DNS=1.1.1.1,8.8.8.8
extra_hosts:
- "postgres:172.20.0.10"
- "rabbitmq:172.20.0.11"
healthcheck:
test: ["CMD", "ping", "-c", "1", "-W", "5", "1.1.1.1"]
interval: 30s
timeout: 10s
retries: 3
start_period: 30s
restart: unless-stopped
# ===========================================
# Backend Services
# ===========================================
@@ -37,50 +77,27 @@ services:
image: git.orfl.xyz/conco/fictionarchive-novel-service:latest
environment:
ConnectionStrings__DefaultConnection: Host=postgres;Database=FictionArchive_NovelService;Username=${POSTGRES_USER:-postgres};Password=${POSTGRES_PASSWORD:-postgres}
ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
RabbitMQ__ConnectionString: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
Novelpia__Username: ${NOVELPIA_USERNAME}
Novelpia__Password: ${NOVELPIA_PASSWORD}
NovelUpdateService__PendingImageUrl: https://files.fictionarchive.orfl.xyz/api/pendingupload.png
healthcheck:
test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/healthz"]
interval: 30s
timeout: 10s
retries: 3
UpdateService__PendingImageUrl: https://files.fictionarchive.orfl.xyz/api/pendingupload.png
depends_on:
postgres:
condition: service_healthy
rabbitmq:
condition: service_healthy
restart: unless-stopped
translation-service:
image: git.orfl.xyz/conco/fictionarchive-translation-service:latest
environment:
ConnectionStrings__DefaultConnection: Host=postgres;Database=FictionArchive_TranslationService;Username=${POSTGRES_USER:-postgres};Password=${POSTGRES_PASSWORD:-postgres}
ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
DeepL__ApiKey: ${DEEPL_API_KEY}
healthcheck:
test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/healthz"]
interval: 30s
timeout: 10s
retries: 3
depends_on:
postgres:
condition: service_healthy
rabbitmq:
vpn:
condition: service_healthy
network_mode: "service:vpn"
restart: unless-stopped
scheduler-service:
image: git.orfl.xyz/conco/fictionarchive-scheduler-service:latest
networks:
- fictionarchive
environment:
ConnectionStrings__DefaultConnection: Host=postgres;Database=FictionArchive_SchedulerService;Username=${POSTGRES_USER:-postgres};Password=${POSTGRES_PASSWORD:-postgres}
ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
healthcheck:
test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/healthz"]
interval: 30s
timeout: 10s
retries: 3
RabbitMQ__ConnectionString: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
depends_on:
postgres:
condition: service_healthy
@@ -90,14 +107,14 @@ services:
user-service:
image: git.orfl.xyz/conco/fictionarchive-user-service:latest
networks:
- fictionarchive
environment:
ConnectionStrings__DefaultConnection: Host=postgres;Database=FictionArchive_UserService;Username=${POSTGRES_USER:-postgres};Password=${POSTGRES_PASSWORD:-postgres}
ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
healthcheck:
test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/healthz"]
interval: 30s
timeout: 10s
retries: 3
RabbitMQ__ConnectionString: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
Authentik__BaseUrl: https://auth.orfl.xyz
Authentik__ApiToken: ${AUTHENTIK_API_TOKEN}
Authentik__EmailStageId: 10df0c18-8802-4ec7-852e-3cdd355514d3
depends_on:
postgres:
condition: service_healthy
@@ -105,42 +122,35 @@ services:
condition: service_healthy
restart: unless-stopped
authentication-service:
image: git.orfl.xyz/conco/fictionarchive-authentication-service:latest
usernoveldata-service:
image: git.orfl.xyz/conco/fictionarchive-usernoveldata-service:latest
networks:
- fictionarchive
environment:
ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
healthcheck:
test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/healthz"]
interval: 30s
timeout: 10s
retries: 3
ConnectionStrings__DefaultConnection: Host=postgres;Database=FictionArchive_UserNovelDataService;Username=${POSTGRES_USER:-postgres};Password=${POSTGRES_PASSWORD:-postgres}
RabbitMQ__ConnectionString: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
depends_on:
postgres:
condition: service_healthy
rabbitmq:
condition: service_healthy
restart: unless-stopped
file-service:
image: git.orfl.xyz/conco/fictionarchive-file-service:latest
networks:
- web
- fictionarchive
environment:
ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
S3__Endpoint: ${S3_ENDPOINT:-https://s3.orfl.xyz}
S3__Bucket: ${S3_BUCKET:-fictionarchive}
RabbitMQ__ConnectionString: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
S3__AccessKey: ${S3_ACCESS_KEY}
S3__SecretKey: ${S3_SECRET_KEY}
Proxy__BaseUrl: https://files.orfl.xyz/api
OIDC__Authority: https://auth.orfl.xyz/application/o/fictionarchive/
OIDC__ClientId: fictionarchive-files
OIDC__Audience: fictionarchive-api
healthcheck:
test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/healthz"]
interval: 30s
timeout: 10s
retries: 3
ProxyConfiguration__BaseUrl: https://files.fictionarchive.orfl.xyz/api
labels:
- "traefik.enable=true"
- "traefik.http.routers.file-service.rule=Host(`files.orfl.xyz`)"
- "traefik.http.routers.file-service.entrypoints=websecure"
- "traefik.http.routers.file-service.tls.certresolver=letsencrypt"
- "traefik.http.routers.file-service.rule=Host(`files.fictionarchive.orfl.xyz`)"
- "traefik.http.routers.file-service.tls=true"
- "traefik.http.routers.file-service.tls.certresolver=lets-encrypt"
- "traefik.http.services.file-service.loadbalancer.server.port=8080"
depends_on:
rabbitmq:
@@ -152,30 +162,23 @@ services:
# ===========================================
api-gateway:
image: git.orfl.xyz/conco/fictionarchive-api:latest
networks:
- web
- fictionarchive
environment:
ConnectionStrings__RabbitMQ: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
OIDC__Authority: https://auth.orfl.xyz/application/o/fictionarchive/
OIDC__ClientId: fictionarchive-api
OIDC__Audience: fictionarchive-api
Cors__AllowedOrigin: https://fictionarchive.orfl.xyz
healthcheck:
test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/healthz"]
interval: 30s
timeout: 10s
retries: 3
labels:
- "traefik.enable=true"
- "traefik.http.routers.api-gateway.rule=Host(`api.fictionarchive.orfl.xyz`)"
- "traefik.http.routers.api-gateway.entrypoints=websecure"
- "traefik.http.routers.api-gateway.tls.certresolver=letsencrypt"
- "traefik.http.routers.api-gateway.tls=true"
- "traefik.http.routers.api-gateway.tls.certresolver=lets-encrypt"
- "traefik.http.services.api-gateway.loadbalancer.server.port=8080"
depends_on:
- novel-service
- translation-service
- scheduler-service
- user-service
- authentication-service
- file-service
- user-service
- usernoveldata-service
restart: unless-stopped
# ===========================================
@@ -183,20 +186,21 @@ services:
# ===========================================
frontend:
image: git.orfl.xyz/conco/fictionarchive-frontend:latest
healthcheck:
test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost/"]
interval: 30s
timeout: 10s
retries: 3
networks:
- web
labels:
- "traefik.enable=true"
- "traefik.http.routers.frontend.rule=Host(`fictionarchive.orfl.xyz`)"
- "traefik.http.routers.frontend.entrypoints=websecure"
- "traefik.http.routers.frontend.tls.certresolver=letsencrypt"
- "traefik.http.services.frontend.loadbalancer.server.port=80"
- traefik.http.routers.fafrontend.rule=Host(`fictionarchive.orfl.xyz`)
- traefik.http.routers.fafrontend.tls=true
- traefik.http.routers.fafrontend.tls.certresolver=lets-encrypt
- traefik.http.services.fafrontend.loadbalancer.server.port=80
- traefik.enable=true
restart: unless-stopped
volumes:
postgres_data:
rabbitmq_data:
letsencrypt:
networks:
web:
external: yes
fictionarchive:
ipam:
driver: default
config:
- subnet: 172.20.0.0/24


@@ -44,6 +44,12 @@
<div
class="absolute right-0 z-50 mt-2 w-48 rounded-md bg-white p-2 shadow-lg dark:bg-gray-800"
>
<a
href="/settings"
class="flex w-full items-center justify-start rounded-md px-4 py-2 text-sm font-medium hover:bg-gray-100 dark:hover:bg-gray-700"
>
Settings
</a>
<Button variant="ghost" class="w-full justify-start" onclick={handleLogout}>
Sign out
</Button>


@@ -0,0 +1,181 @@
<script lang="ts">
import { Button } from '$lib/components/ui/button';
import { Popover, PopoverTrigger, PopoverContent } from '$lib/components/ui/popover';
import { Textarea } from '$lib/components/ui/textarea';
import { client } from '$lib/graphql/client';
import { UpsertBookmarkDocument, RemoveBookmarkDocument } from '$lib/graphql/__generated__/graphql';
import Bookmark from '@lucide/svelte/icons/bookmark';
import BookmarkCheck from '@lucide/svelte/icons/bookmark-check';
interface Props {
novelId: number;
chapterId: number;
isBookmarked?: boolean;
bookmarkDescription?: string | null;
size?: 'default' | 'sm' | 'icon';
onBookmarkChange?: (isBookmarked: boolean, description?: string | null) => void;
}
let {
novelId,
chapterId,
isBookmarked = false,
bookmarkDescription = null,
size = 'icon',
onBookmarkChange
}: Props = $props();
// Bookmark state
let popoverOpen = $state(false);
let description = $state(bookmarkDescription ?? '');
let saving = $state(false);
let removing = $state(false);
let error: string | null = $state(null);
// Reset description when popover opens
$effect(() => {
if (popoverOpen) {
description = bookmarkDescription ?? '';
error = null;
}
});
async function saveBookmark() {
saving = true;
error = null;
try {
const result = await client
.mutation(UpsertBookmarkDocument, {
input: {
chapterId,
novelId,
description: description.trim() || null
}
})
.toPromise();
if (result.error) {
error = result.error.message;
return;
}
if (result.data?.upsertBookmark?.errors?.length) {
error = result.data.upsertBookmark.errors[0]?.message ?? 'Failed to save bookmark';
return;
}
if (result.data?.upsertBookmark?.bookmarkPayload?.success) {
popoverOpen = false;
onBookmarkChange?.(true, description.trim() || null);
}
} catch (e) {
error = e instanceof Error ? e.message : 'Failed to save bookmark';
} finally {
saving = false;
}
}
async function removeBookmark() {
removing = true;
error = null;
try {
const result = await client
.mutation(RemoveBookmarkDocument, {
input: { chapterId }
})
.toPromise();
if (result.error) {
error = result.error.message;
return;
}
if (result.data?.removeBookmark?.errors?.length) {
error = result.data.removeBookmark.errors[0]?.message ?? 'Failed to remove bookmark';
return;
}
if (result.data?.removeBookmark?.bookmarkPayload?.success) {
popoverOpen = false;
description = '';
onBookmarkChange?.(false, null);
}
} catch (e) {
error = e instanceof Error ? e.message : 'Failed to remove bookmark';
} finally {
removing = false;
}
}
function handleClick(e: MouseEvent) {
e.preventDefault();
e.stopPropagation();
}
</script>
<!-- svelte-ignore a11y_click_events_have_key_events -->
<!-- svelte-ignore a11y_no_static_element_interactions -->
<div onclick={handleClick}>
<Popover bind:open={popoverOpen}>
<PopoverTrigger asChild>
{#snippet child({ props })}
<Button
variant={isBookmarked ? 'default' : 'ghost'}
{size}
class={size === 'icon' ? 'h-8 w-8' : 'gap-2'}
{...props}
>
{#if isBookmarked}
<BookmarkCheck class="h-4 w-4" />
{:else}
<Bookmark class="h-4 w-4" />
{/if}
{#if size !== 'icon'}
<span>{isBookmarked ? 'Bookmarked' : 'Bookmark'}</span>
{/if}
</Button>
{/snippet}
</PopoverTrigger>
<PopoverContent class="w-80">
<div class="space-y-4">
<div class="space-y-2">
<h4 class="font-medium leading-none">
{isBookmarked ? 'Edit bookmark' : 'Bookmark this chapter'}
</h4>
<p class="text-sm text-muted-foreground">
{isBookmarked ? 'Update your note or remove the bookmark.' : 'Add an optional note to remember why you bookmarked this.'}
</p>
</div>
<Textarea
bind:value={description}
placeholder="Add a note..."
class="min-h-[80px] resize-none"
/>
{#if error}
<p class="text-sm text-destructive">{error}</p>
{/if}
<div class="flex justify-end gap-2">
{#if isBookmarked}
<Button
variant="destructive"
size="sm"
onclick={removeBookmark}
disabled={removing || saving}
>
{removing ? 'Removing...' : 'Remove'}
</Button>
{/if}
<Button
size="sm"
onclick={saveBookmark}
disabled={saving || removing}
>
{saving ? 'Saving...' : 'Save'}
</Button>
</div>
</div>
</PopoverContent>
</Popover>
</div>
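The save/remove handlers in this component normalize the note with `description.trim() || null` and then mirror the server result into local state via `onBookmarkChange`. That logic can be expressed as pure helpers; a minimal sketch (the helper names and the `BookmarkLike` shape are illustrative, not part of the component):

```typescript
// Minimal bookmark shape used by the component's callbacks (illustrative).
interface BookmarkLike {
  chapterId: number;
  description: string | null;
}

// Mirrors `description.trim() || null`: an empty or whitespace-only
// note is stored as null rather than an empty string.
function normalizeDescription(raw: string): string | null {
  return raw.trim() || null;
}

// Applies an upsert/remove result to a local bookmark list without
// refetching, the same shape the parent's change handler uses.
function applyBookmarkChange(
  bookmarks: BookmarkLike[],
  chapterId: number,
  isBookmarked: boolean,
  description?: string | null
): BookmarkLike[] {
  if (!isBookmarked) {
    // Remove: drop the entry for this chapter.
    return bookmarks.filter((b) => b.chapterId !== chapterId);
  }
  const next: BookmarkLike = { chapterId, description: description ?? null };
  const idx = bookmarks.findIndex((b) => b.chapterId === chapterId);
  // Upsert: replace in place if present, otherwise append.
  return idx >= 0
    ? bookmarks.map((b, i) => (i === idx ? next : b))
    : [...bookmarks, next];
}
```

Keeping these as pure functions would let both this button and the novel-page list share one update path.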


@@ -1,27 +1,138 @@
<script lang="ts">
import { Button } from '$lib/components/ui/button';
import { Popover, PopoverTrigger, PopoverContent } from '$lib/components/ui/popover';
import { Textarea } from '$lib/components/ui/textarea';
import { client } from '$lib/graphql/client';
import { UpsertBookmarkDocument, RemoveBookmarkDocument } from '$lib/graphql/__generated__/graphql';
import ChevronLeft from '@lucide/svelte/icons/chevron-left';
import ChevronRight from '@lucide/svelte/icons/chevron-right';
import List from '@lucide/svelte/icons/list';
import Bookmark from '@lucide/svelte/icons/bookmark';
import BookmarkCheck from '@lucide/svelte/icons/bookmark-check';
interface Props {
novelId: string;
chapterId?: number;
prevChapterVolumeOrder: number | null | undefined;
prevChapterOrder: number | null | undefined;
nextChapterVolumeOrder: number | null | undefined;
nextChapterOrder: number | null | undefined;
showKeyboardHints?: boolean;
isBookmarked?: boolean;
bookmarkDescription?: string | null;
onBookmarkChange?: (isBookmarked: boolean, description?: string | null) => void;
}
let { novelId, prevChapterOrder, nextChapterOrder, showKeyboardHints = true }: Props = $props();
let {
novelId,
chapterId,
prevChapterVolumeOrder,
prevChapterOrder,
nextChapterVolumeOrder,
nextChapterOrder,
showKeyboardHints = true,
isBookmarked = false,
bookmarkDescription = null,
onBookmarkChange
}: Props = $props();
const hasPrev = $derived(prevChapterOrder != null);
const hasNext = $derived(nextChapterOrder != null);
const hasPrev = $derived(prevChapterOrder != null && prevChapterVolumeOrder != null);
const hasNext = $derived(nextChapterOrder != null && nextChapterVolumeOrder != null);
// Bookmark state
let popoverOpen = $state(false);
let description = $state(bookmarkDescription ?? '');
let saving = $state(false);
let removing = $state(false);
let error: string | null = $state(null);
// Reset description when popover opens
$effect(() => {
if (popoverOpen) {
description = bookmarkDescription ?? '';
error = null;
}
});
async function saveBookmark() {
if (!chapterId) return;
saving = true;
error = null;
try {
const result = await client
.mutation(UpsertBookmarkDocument, {
input: {
chapterId,
novelId: parseInt(novelId, 10),
description: description.trim() || null
}
})
.toPromise();
if (result.error) {
error = result.error.message;
return;
}
if (result.data?.upsertBookmark?.errors?.length) {
error = result.data.upsertBookmark.errors[0]?.message ?? 'Failed to save bookmark';
return;
}
if (result.data?.upsertBookmark?.bookmarkPayload?.success) {
popoverOpen = false;
onBookmarkChange?.(true, description.trim() || null);
}
} catch (e) {
error = e instanceof Error ? e.message : 'Failed to save bookmark';
} finally {
saving = false;
}
}
async function removeBookmark() {
if (!chapterId) return;
removing = true;
error = null;
try {
const result = await client
.mutation(RemoveBookmarkDocument, {
input: { chapterId }
})
.toPromise();
if (result.error) {
error = result.error.message;
return;
}
if (result.data?.removeBookmark?.errors?.length) {
error = result.data.removeBookmark.errors[0]?.message ?? 'Failed to remove bookmark';
return;
}
if (result.data?.removeBookmark?.bookmarkPayload?.success) {
popoverOpen = false;
description = '';
onBookmarkChange?.(false, null);
}
} catch (e) {
error = e instanceof Error ? e.message : 'Failed to remove bookmark';
} finally {
removing = false;
}
}
</script>
<div class="flex flex-col gap-2">
<div class="flex items-center justify-between gap-4">
<Button
variant="outline"
href={hasPrev ? `/novels/${novelId}/chapters/${prevChapterOrder}` : undefined}
href={hasPrev ? `/novels/${novelId}/volumes/${prevChapterVolumeOrder}/chapters/${prevChapterOrder}` : undefined}
disabled={!hasPrev}
class="gap-2"
>
@@ -29,14 +140,76 @@
<span class="hidden sm:inline">Previous</span>
</Button>
<div class="flex items-center gap-2">
<Button variant="outline" href="/novels/{novelId}" class="gap-2">
<List class="h-4 w-4" />
<span class="hidden sm:inline">Contents</span>
</Button>
{#if chapterId}
<Popover bind:open={popoverOpen}>
<PopoverTrigger asChild>
{#snippet child({ props })}
<Button
variant={isBookmarked ? 'default' : 'outline'}
class="gap-2"
{...props}
>
{#if isBookmarked}
<BookmarkCheck class="h-4 w-4" />
{:else}
<Bookmark class="h-4 w-4" />
{/if}
<span class="hidden sm:inline">{isBookmarked ? 'Bookmarked' : 'Bookmark'}</span>
</Button>
{/snippet}
</PopoverTrigger>
<PopoverContent class="w-80">
<div class="space-y-4">
<div class="space-y-2">
<h4 class="font-medium leading-none">
{isBookmarked ? 'Edit bookmark' : 'Bookmark this chapter'}
</h4>
<p class="text-sm text-muted-foreground">
{isBookmarked ? 'Update your note or remove the bookmark.' : 'Add an optional note to remember why you bookmarked this.'}
</p>
</div>
<Textarea
bind:value={description}
placeholder="Add a note..."
class="min-h-[80px] resize-none"
/>
{#if error}
<p class="text-sm text-destructive">{error}</p>
{/if}
<div class="flex justify-end gap-2">
{#if isBookmarked}
<Button
variant="destructive"
size="sm"
onclick={removeBookmark}
disabled={removing || saving}
>
{removing ? 'Removing...' : 'Remove'}
</Button>
{/if}
<Button
size="sm"
onclick={saveBookmark}
disabled={saving || removing}
>
{saving ? 'Saving...' : 'Save'}
</Button>
</div>
</div>
</PopoverContent>
</Popover>
{/if}
</div>
<Button
variant="outline"
href={hasNext ? `/novels/${novelId}/chapters/${nextChapterOrder}` : undefined}
href={hasNext ? `/novels/${novelId}/volumes/${nextChapterVolumeOrder}/chapters/${nextChapterOrder}` : undefined}
disabled={!hasNext}
class="gap-2"
>

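The navigation change above moves chapter links from `/novels/{id}/chapters/{order}` to a volume-aware `/novels/{id}/volumes/{volumeOrder}/chapters/{order}` path, and disables the button unless both orders are present (`hasPrev`/`hasNext`). A pure sketch of that guard, assuming the route shape shown in the diff (the function name is illustrative):

```typescript
// Builds the volume-aware chapter path used by the prev/next buttons,
// or returns undefined (rendered as a disabled button) when either
// order is null or undefined.
function chapterHref(
  novelId: string,
  volumeOrder: number | null | undefined,
  chapterOrder: number | null | undefined
): string | undefined {
  if (volumeOrder == null || chapterOrder == null) return undefined;
  return `/novels/${novelId}/volumes/${volumeOrder}/chapters/${chapterOrder}`;
}
```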

@@ -1,13 +1,14 @@
<script lang="ts" module>
import type { GetChapterQuery } from '$lib/graphql/__generated__/graphql';
import type { GetChapterQuery, GetBookmarksQuery } from '$lib/graphql/__generated__/graphql';
export type ChapterData = NonNullable<GetChapterQuery['chapter']>;
export type BookmarkData = GetBookmarksQuery['bookmarks'][number];
</script>
<script lang="ts">
import { onMount, onDestroy } from 'svelte';
import { client } from '$lib/graphql/client';
import { GetChapterDocument } from '$lib/graphql/__generated__/graphql';
import { GetChapterDocument, GetBookmarksDocument } from '$lib/graphql/__generated__/graphql';
import { Card, CardContent } from '$lib/components/ui/card';
import { Button } from '$lib/components/ui/button';
import ChapterNavigation from './ChapterNavigation.svelte';
@@ -16,10 +17,11 @@
interface Props {
novelId?: string;
volumeOrder?: string;
chapterNumber?: string;
}
let { novelId, chapterNumber }: Props = $props();
let { novelId, volumeOrder, chapterNumber }: Props = $props();
// State
let chapter: ChapterData | null = $state(null);
@@ -27,6 +29,10 @@
let error: string | null = $state(null);
let scrollProgress = $state(0);
// Bookmark state
let isBookmarked = $state(false);
let bookmarkDescription: string | null = $state(null);
// Derived values
const sanitizedBody = $derived(chapter?.body ? sanitizeChapterHtml(chapter.body) : '');
@@ -42,16 +48,16 @@
return;
}
if (event.key === 'ArrowLeft' && chapter?.prevChapterOrder != null) {
window.location.href = `/novels/${novelId}/chapters/${chapter.prevChapterOrder}`;
} else if (event.key === 'ArrowRight' && chapter?.nextChapterOrder != null) {
window.location.href = `/novels/${novelId}/chapters/${chapter.nextChapterOrder}`;
if (event.key === 'ArrowLeft' && chapter?.prevChapterOrder != null && chapter?.prevChapterVolumeOrder != null) {
window.location.href = `/novels/${novelId}/volumes/${chapter.prevChapterVolumeOrder}/chapters/${chapter.prevChapterOrder}`;
} else if (event.key === 'ArrowRight' && chapter?.nextChapterOrder != null && chapter?.nextChapterVolumeOrder != null) {
window.location.href = `/novels/${novelId}/volumes/${chapter.nextChapterVolumeOrder}/chapters/${chapter.nextChapterOrder}`;
}
}
async function fetchChapter() {
if (!novelId || !chapterNumber) {
error = 'Missing novel ID or chapter number';
if (!novelId || !volumeOrder || !chapterNumber) {
error = 'Missing novel ID, volume order, or chapter number';
fetching = false;
return;
}
@@ -63,6 +69,7 @@
const result = await client
.query(GetChapterDocument, {
novelId: parseInt(novelId, 10),
volumeOrder: parseInt(volumeOrder, 10),
chapterOrder: parseInt(chapterNumber, 10)
})
.toPromise();
@@ -74,6 +81,10 @@
if (result.data?.chapter) {
chapter = result.data.chapter;
// Update the page title with chapter info
document.title = `${chapter.novelName} - ${chapter.order}`;
// Fetch bookmark status
await fetchBookmarks();
} else {
error = 'Chapter not found';
}
@@ -84,6 +95,34 @@
}
}
async function fetchBookmarks() {
if (!novelId || !chapter) return;
try {
const result = await client
.query(GetBookmarksDocument, { novelId: parseInt(novelId, 10) })
.toPromise();
if (result.data?.bookmarks) {
const bookmark = result.data.bookmarks.find((b) => b.chapterId === chapter!.id);
if (bookmark) {
isBookmarked = true;
bookmarkDescription = bookmark.description ?? null;
} else {
isBookmarked = false;
bookmarkDescription = null;
}
}
} catch {
// Silently fail - bookmark status is non-critical
}
}
function handleBookmarkChange(newIsBookmarked: boolean, newDescription?: string | null) {
isBookmarked = newIsBookmarked;
bookmarkDescription = newDescription ?? null;
}
onMount(() => {
fetchChapter();
window.addEventListener('scroll', handleScroll, { passive: true });
@@ -135,8 +174,14 @@
<!-- Navigation (top) -->
<ChapterNavigation
novelId={novelId ?? ''}
chapterId={chapter.id}
prevChapterVolumeOrder={chapter.prevChapterVolumeOrder}
prevChapterOrder={chapter.prevChapterOrder}
nextChapterVolumeOrder={chapter.nextChapterVolumeOrder}
nextChapterOrder={chapter.nextChapterOrder}
{isBookmarked}
{bookmarkDescription}
onBookmarkChange={handleBookmarkChange}
/>
<!-- Chapter Header -->
@@ -167,9 +212,15 @@
<!-- Navigation (bottom) -->
<ChapterNavigation
novelId={novelId ?? ''}
chapterId={chapter.id}
prevChapterVolumeOrder={chapter.prevChapterVolumeOrder}
prevChapterOrder={chapter.prevChapterOrder}
nextChapterVolumeOrder={chapter.nextChapterVolumeOrder}
nextChapterOrder={chapter.nextChapterOrder}
showKeyboardHints={false}
{isBookmarked}
{bookmarkDescription}
onBookmarkChange={handleBookmarkChange}
/>
{/if}
</div>

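`fetchBookmarks` above derives the current chapter's bookmark state by scanning the novel's bookmark list and falling back to "not bookmarked" when no entry matches. That derivation is a pure lookup; a sketch with simplified types (names here are illustrative, not the generated GraphQL types):

```typescript
// Simplified stand-in for the generated bookmark type (illustrative).
interface BookmarkEntry {
  chapterId: number;
  description?: string | null;
}

interface BookmarkStatus {
  isBookmarked: boolean;
  description: string | null;
}

// Same logic as fetchBookmarks: find the entry for this chapter,
// defaulting to "not bookmarked" when none exists.
function bookmarkStatus(
  bookmarks: BookmarkEntry[],
  chapterId: number
): BookmarkStatus {
  const found = bookmarks.find((b) => b.chapterId === chapterId);
  return found
    ? { isBookmarked: true, description: found.description ?? null }
    : { isBookmarked: false, description: null };
}
```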

@@ -1,7 +1,7 @@
<script lang="ts">
import { Input } from '$lib/components/ui/input';
import * as NavigationMenu from '$lib/components/ui/navigation-menu';
import AuthenticationDisplay from './AuthenticationDisplay.svelte';
import SearchBar from './SearchBar.svelte';
let pathname = $state(typeof window !== 'undefined' ? window.location.pathname : '/');
@@ -27,7 +27,7 @@
</NavigationMenu.List>
</NavigationMenu.Root>
<div class="flex-1"></div>
<Input type="search" placeholder="Search..." class="max-w-xs" />
<SearchBar />
<AuthenticationDisplay />
</nav>
</header>


@@ -1,5 +1,6 @@
<script lang="ts" module>
import type { NovelsQuery, NovelStatus } from '$lib/graphql/__generated__/graphql';
import { SystemTags } from '$lib/constants/systemTags';
export type NovelNode = NonNullable<NonNullable<NovelsQuery['novels']>['edges']>[number]['node'];
@@ -55,6 +56,8 @@
const status = $derived(novel.rawStatus ?? 'UNKNOWN');
const statusColor = $derived(statusColors[status]);
const statusLabel = $derived(statusLabels[status]);
const isNsfw = $derived(novel.tags?.some((tag) => tag.key === SystemTags.Nsfw) ?? false);
</script>
<a
@@ -76,6 +79,9 @@
>
{statusLabel}
</Badge>
{#if isNsfw}
<Badge class="absolute top-9 right-2 bg-red-600 text-white shadow-sm">NSFW</Badge>
{/if}
</div>
<CardHeader class="space-y-2 pt-4">
<CardTitle class="line-clamp-2 text-lg leading-tight" title={title}>


@@ -1,5 +1,9 @@
<script lang="ts" module>
import type { NovelQuery, NovelStatus, Language } from '$lib/graphql/__generated__/graphql';
import type { NovelQuery, NovelStatus, Language, GetBookmarksQuery } from '$lib/graphql/__generated__/graphql';
import { TagType } from '$lib/graphql/__generated__/graphql';
import { SystemTags } from '$lib/constants/systemTags';
export type BookmarkData = GetBookmarksQuery['bookmarks'][number];
export type NovelNode = NonNullable<NonNullable<NovelQuery['novels']>['nodes']>[number];
@@ -30,12 +34,18 @@
<script lang="ts">
import { onMount } from 'svelte';
import { client } from '$lib/graphql/client';
import { NovelDocument, ImportNovelDocument } from '$lib/graphql/__generated__/graphql';
import { NovelDocument, ImportNovelDocument, DeleteNovelDocument, GetBookmarksDocument } from '$lib/graphql/__generated__/graphql';
import { isAuthenticated } from '$lib/auth/authStore';
import { Card, CardContent, CardHeader } from '$lib/components/ui/card';
import { Badge } from '$lib/components/ui/badge';
import { Button } from '$lib/components/ui/button';
import { Tabs, TabsList, TabsTrigger, TabsContent } from '$lib/components/ui/tabs';
import {
Accordion,
AccordionItem,
AccordionTrigger,
AccordionContent
} from '$lib/components/ui/accordion';
import {
Tooltip,
TooltipTrigger,
@@ -44,6 +54,7 @@
} from '$lib/components/ui/tooltip';
import { formatRelativeTime, formatAbsoluteTime } from '$lib/utils/time';
import { sanitizeHtml } from '$lib/utils/sanitize';
import ChapterBookmarkButton from './ChapterBookmarkButton.svelte';
// Direct imports for faster builds
import ArrowLeft from '@lucide/svelte/icons/arrow-left';
import ExternalLink from '@lucide/svelte/icons/external-link';
@@ -51,6 +62,7 @@
import ChevronDown from '@lucide/svelte/icons/chevron-down';
import ChevronUp from '@lucide/svelte/icons/chevron-up';
import RefreshCw from '@lucide/svelte/icons/refresh-cw';
import Trash2 from '@lucide/svelte/icons/trash-2';
import X from '@lucide/svelte/icons/x';
import ChevronLeft from '@lucide/svelte/icons/chevron-left';
import ChevronRight from '@lucide/svelte/icons/chevron-right';
@@ -69,10 +81,16 @@
let refreshError: string | null = $state(null);
let refreshSuccess = $state(false);
// Delete state
let showDeleteConfirm = $state(false);
let deleting = $state(false);
let deleteError: string | null = $state(null);
// Image viewer state
type GalleryImage = {
src: string;
alt: string;
volumeOrder?: number;
chapterId?: number;
chapterOrder?: number;
chapterName?: string;
@@ -80,6 +98,13 @@
};
let viewerOpen = $state(false);
let viewerIndex = $state(0);
let activeTab = $state('chapters');
let galleryLoaded = $state(false);
// Bookmarks state
let bookmarks: BookmarkData[] = $state([]);
let bookmarksLoaded = $state(false);
let bookmarksFetching = $state(false);
const DESCRIPTION_PREVIEW_LENGTH = 300;
@@ -104,14 +129,57 @@
: descriptionHtml
);
const sortedChapters = $derived(
[...(novel?.chapters ?? [])].sort((a, b) => a.order - b.order)
// Volume-aware chapter organization
const sortedVolumes = $derived(
[...(novel?.volumes ?? [])].sort((a, b) => a.order - b.order)
);
const chapterCount = $derived(novel?.chapters?.length ?? 0);
const isSingleVolume = $derived(sortedVolumes.length === 1);
// Chapter lookup for bookmarks (maps chapterId to chapter details)
const chapterLookup = $derived(
new Map(
sortedVolumes.flatMap((v) =>
v.chapters.map((c) => [c.id, { ...c, volumeOrder: v.order }])
)
)
);
// Bookmark lookup by chapterId for quick access in chapter list
const bookmarkLookup = $derived(
new Map(bookmarks.map((b) => [b.chapterId, b]))
);
function handleChapterBookmarkChange(chapterId: number, isBookmarked: boolean, description?: string | null) {
if (isBookmarked) {
// Add or update bookmark in local state
const existingIndex = bookmarks.findIndex((b) => b.chapterId === chapterId);
const newBookmark = {
id: existingIndex >= 0 ? bookmarks[existingIndex].id : -1, // temp id
chapterId,
description: description ?? null,
createdTime: new Date().toISOString()
};
if (existingIndex >= 0) {
bookmarks[existingIndex] = newBookmark;
} else {
bookmarks = [...bookmarks, newBookmark];
}
} else {
// Remove bookmark from local state
bookmarks = bookmarks.filter((b) => b.chapterId !== chapterId);
}
}
const chapterCount = $derived(
sortedVolumes.reduce((sum, v) => sum + v.chapters.length, 0)
);
// Filter out system tags for display, check for NSFW
const displayTags = $derived(novel?.tags?.filter((tag) => tag.tagType !== TagType.System) ?? []);
const isNsfw = $derived(novel?.tags?.some((tag) => tag.key === SystemTags.Nsfw) ?? false);
const canRefresh = $derived(() => {
if (status === 'COMPLETED') return false;
if (!lastUpdated) return true;
const sixHoursAgo = Date.now() - 6 * 60 * 60 * 1000;
return lastUpdated.getTime() < sixHoursAgo;
@@ -126,13 +194,16 @@
images.push({ src: coverSrc, alt: `${novel.name} cover`, isCover: true });
}
// Add chapter images
for (const chapter of sortedChapters) {
// Add chapter images (loop through volumes to preserve volumeOrder)
for (const volume of sortedVolumes) {
const volumeChapters = [...volume.chapters].sort((a, b) => a.order - b.order);
for (const chapter of volumeChapters) {
for (const img of chapter.images ?? []) {
if (img.newPath) {
images.push({
src: img.newPath,
alt: `Image from ${chapter.name}`,
volumeOrder: volume.order,
chapterId: chapter.id,
chapterOrder: chapter.order,
chapterName: chapter.name,
@@ -141,11 +212,48 @@
}
}
}
}
return images;
});
const currentImage = $derived(galleryImages[viewerIndex]);
const imageCount = $derived(galleryImages.length);
// Load gallery images when tab is first activated
$effect(() => {
if (activeTab === 'gallery' && !galleryLoaded) {
galleryLoaded = true;
}
});
// Load bookmarks when novel is loaded (for count display)
$effect(() => {
if (novel && !bookmarksLoaded && novelId) {
fetchBookmarks();
}
});
async function fetchBookmarks() {
if (!novelId || bookmarksFetching) return;
bookmarksFetching = true;
try {
const result = await client
.query(GetBookmarksDocument, { novelId: parseInt(novelId, 10) })
.toPromise();
if (result.data?.bookmarks) {
bookmarks = result.data.bookmarks;
}
} catch {
// Silently fail - bookmarks are non-critical
} finally {
bookmarksFetching = false;
bookmarksLoaded = true;
}
}
// Image viewer functions
function openImageViewer(index: number) {
@@ -199,6 +307,7 @@
const nodes = result.data?.novels?.nodes;
if (nodes && nodes.length > 0) {
novel = nodes[0];
document.title = novel.name;
} else {
error = 'Novel not found';
}
@@ -234,6 +343,32 @@
}
}
async function deleteNovel() {
if (!novel) return;
deleting = true;
deleteError = null;
try {
const result = await client
.mutation(DeleteNovelDocument, { input: { novelId: novel.id } })
.toPromise();
if (result.error) {
deleteError = result.error.message;
} else if (result.data?.deleteNovel?.errors?.length) {
deleteError = result.data.deleteNovel.errors[0].message;
} else {
// Successfully deleted - redirect to novels list
window.location.href = '/novels';
}
} catch (e) {
deleteError = e instanceof Error ? e.message : 'Failed to delete';
} finally {
deleting = false;
}
}
onMount(() => {
fetchNovel();
});
@@ -321,6 +456,9 @@
<!-- Badges -->
<div class="flex flex-wrap gap-2 items-center">
<Badge class={statusColor}>{statusLabel}</Badge>
{#if isNsfw}
<Badge class="bg-red-600 text-white">NSFW</Badge>
{/if}
<Badge variant="outline">{languageLabel}</Badge>
{#if $isAuthenticated}
<TooltipProvider>
@@ -339,11 +477,20 @@
</TooltipTrigger>
{#if !canRefresh()}
<TooltipContent>
{status === 'COMPLETED' ? 'Cannot refresh completed novels' : 'Updated less than 6 hours ago'}
Updated less than 6 hours ago
</TooltipContent>
{/if}
</Tooltip>
</TooltipProvider>
<Button
variant="destructive"
size="sm"
onclick={() => (showDeleteConfirm = true)}
class="gap-1.5 h-6 text-xs"
>
<Trash2 class="h-3 w-3" />
Delete
</Button>
{/if}
{#if refreshSuccess}
<Badge variant="outline" class="bg-green-500/10 text-green-600 border-green-500/30">
@@ -390,9 +537,9 @@
</div>
<!-- Tags -->
{#if novel.tags && novel.tags.length > 0}
{#if displayTags.length > 0}
<div class="flex flex-wrap gap-1.5 pt-1">
{#each novel.tags as tag (tag.key)}
{#each displayTags as tag (tag.key)}
<Badge
variant="secondary"
href="/novels?tags={tag.key}"
@@ -435,61 +582,130 @@
<!-- Tabbed Content -->
<Card>
<Tabs value="chapters" class="w-full">
<Tabs bind:value={activeTab} class="w-full">
<CardHeader class="pb-0">
<TabsList class="grid w-full grid-cols-3 bg-muted/50 p-1 rounded-lg">
<TabsTrigger
value="chapters"
class="rounded-md data-[state=active]:bg-background data-[state=active]:shadow-sm px-3 py-1.5 text-sm font-medium transition-all"
>
Chapters
Chapters ({chapterCount})
</TabsTrigger>
<TabsTrigger
value="gallery"
class="rounded-md data-[state=active]:bg-background data-[state=active]:shadow-sm px-3 py-1.5 text-sm font-medium transition-all"
>
Gallery
Gallery ({imageCount})
</TabsTrigger>
<TabsTrigger
value="bookmarks"
disabled
class="rounded-md data-[state=active]:bg-background data-[state=active]:shadow-sm px-3 py-1.5 text-sm font-medium transition-all disabled:opacity-50 disabled:cursor-not-allowed"
class="rounded-md data-[state=active]:bg-background data-[state=active]:shadow-sm px-3 py-1.5 text-sm font-medium transition-all"
>
Bookmarks
Bookmarks{bookmarksLoaded ? ` (${bookmarks.length})` : ''}
</TabsTrigger>
</TabsList>
</CardHeader>
<CardContent class="pt-4">
<TabsContent value="chapters" class="mt-0">
{#if sortedChapters.length === 0}
{#if chapterCount === 0}
<p class="text-muted-foreground text-sm py-4 text-center">
No chapters available yet.
</p>
{:else}
{:else if isSingleVolume}
<!-- Single volume: flat chapter list -->
{@const singleVolumeChapters = [...(sortedVolumes[0]?.chapters ?? [])].sort((a, b) => a.order - b.order)}
<div class="max-h-96 overflow-y-auto -mx-2">
{#each sortedChapters as chapter (chapter.id)}
{#each singleVolumeChapters as chapter (chapter.id)}
{@const chapterDate = chapter.lastUpdatedTime ? new Date(chapter.lastUpdatedTime) : null}
{@const chapterBookmark = bookmarkLookup.get(chapter.id)}
<div class="flex items-center px-3 py-2.5 hover:bg-muted/50 rounded-md transition-colors group">
<a
href="/novels/{novelId}/chapters/{chapter.order}"
class="flex items-center justify-between px-3 py-2.5 hover:bg-muted/50 rounded-md transition-colors group"
href="/novels/{novelId}/volumes/{sortedVolumes[0]?.order}/chapters/{chapter.order}"
class="flex items-center gap-3 min-w-0 flex-1"
>
<span class="text-muted-foreground text-sm font-medium shrink-0 w-14">
Ch. {chapter.order}
</span>
<span class="text-sm truncate group-hover:text-primary transition-colors">
{chapter.name}
</span>
</a>
<div class="flex items-center gap-2 shrink-0 ml-2">
{#if chapterDate}
<span class="text-xs text-muted-foreground/70">
{formatRelativeTime(chapterDate)}
</span>
{/if}
{#if novelId}
<ChapterBookmarkButton
novelId={parseInt(novelId, 10)}
chapterId={chapter.id}
isBookmarked={!!chapterBookmark}
bookmarkDescription={chapterBookmark?.description}
onBookmarkChange={(isBookmarked, description) => handleChapterBookmarkChange(chapter.id, isBookmarked, description)}
/>
{/if}
</div>
</div>
{/each}
</div>
{:else}
<!-- Multiple volumes: accordion display -->
<div class="max-h-96 overflow-y-auto -mx-2">
<Accordion type="single">
{#each sortedVolumes as volume (volume.id)}
{@const volumeChapters = [...volume.chapters].sort((a, b) => a.order - b.order)}
<AccordionItem value="volume-{volume.id}">
<AccordionTrigger class="px-3">
<div class="flex items-center gap-3">
<span class="font-medium">{volume.name}</span>
<span class="text-xs text-muted-foreground">
({volumeChapters.length} chapters)
</span>
</div>
</AccordionTrigger>
<AccordionContent>
<div class="space-y-0.5">
{#each volumeChapters as chapter (chapter.id)}
{@const chapterDate = chapter.lastUpdatedTime ? new Date(chapter.lastUpdatedTime) : null}
{@const chapterBookmark = bookmarkLookup.get(chapter.id)}
<div class="flex items-center px-3 py-2.5 hover:bg-muted/50 rounded-md transition-colors group">
<a
href="/novels/{novelId}/volumes/{volume.order}/chapters/{chapter.order}"
class="flex items-center gap-3 min-w-0 flex-1"
>
<span class="text-muted-foreground text-sm font-medium shrink-0 w-14">
Ch. {chapter.order}
</span>
<span class="text-sm truncate group-hover:text-primary transition-colors">
{chapter.name}
</span>
</a>
<div class="flex items-center gap-2 shrink-0 ml-2">
{#if chapterDate}
<span class="text-xs text-muted-foreground/70">
{formatRelativeTime(chapterDate)}
</span>
{/if}
{#if novelId}
<ChapterBookmarkButton
novelId={parseInt(novelId, 10)}
chapterId={chapter.id}
isBookmarked={!!chapterBookmark}
bookmarkDescription={chapterBookmark?.description}
onBookmarkChange={(isBookmarked, description) => handleChapterBookmarkChange(chapter.id, isBookmarked, description)}
/>
{/if}
</div>
</div>
{/each}
</div>
</AccordionContent>
</AccordionItem>
{/each}
</Accordion>
</div>
{/if}
</TabsContent>
@@ -498,7 +714,7 @@
<p class="text-muted-foreground text-sm py-4 text-center">
No images available.
</p>
{:else if galleryLoaded}
<div class="grid grid-cols-3 sm:grid-cols-4 md:grid-cols-5 gap-2">
{#each galleryImages as image, index (image.src)}
<button
@@ -506,20 +722,68 @@
onclick={() => openImageViewer(index)}
class="relative aspect-square overflow-hidden rounded-md bg-muted/50 hover:ring-2 ring-primary transition-all"
>
<img src={image.src} alt={image.alt} class="h-full w-full object-cover" loading="lazy" />
{#if image.isCover}
<Badge class="absolute top-1 left-1 text-xs">Cover</Badge>
{/if}
</button>
{/each}
</div>
{:else}
<div class="flex items-center justify-center py-8">
<div
class="border-primary h-8 w-8 animate-spin rounded-full border-2 border-t-transparent"
aria-label="Loading gallery"
></div>
</div>
{/if}
</TabsContent>
<TabsContent value="bookmarks" class="mt-0">
{#if bookmarksFetching}
<div class="flex items-center justify-center py-8">
<div
class="border-primary h-8 w-8 animate-spin rounded-full border-2 border-t-transparent"
aria-label="Loading bookmarks"
></div>
</div>
{:else if bookmarks.length === 0}
<p class="text-muted-foreground text-sm py-8 text-center">
No bookmarks yet. Add bookmarks while reading chapters.
</p>
{:else}
<div class="max-h-96 overflow-y-auto -mx-2">
{#each bookmarks as bookmark (bookmark.id)}
{@const chapter = chapterLookup.get(bookmark.chapterId)}
{#if chapter}
{@const bookmarkDate = new Date(bookmark.createdTime)}
<a
href="/novels/{novelId}/volumes/{chapter.volumeOrder}/chapters/{chapter.order}"
class="flex items-center justify-between px-3 py-2.5 hover:bg-muted/50 rounded-md transition-colors group"
>
<div class="flex-1 min-w-0">
<div class="flex items-center gap-3">
<span class="text-muted-foreground text-sm font-medium shrink-0 w-14">
Ch. {chapter.order}
</span>
<span class="text-sm truncate group-hover:text-primary transition-colors">
{chapter.name}
</span>
</div>
{#if bookmark.description}
<p class="text-xs text-muted-foreground/70 mt-1 ml-[4.25rem] truncate">
{bookmark.description}
</p>
{/if}
</div>
<span class="text-xs text-muted-foreground/70 shrink-0 ml-2">
{formatRelativeTime(bookmarkDate)}
</span>
</a>
{/if}
{/each}
</div>
{/if}
</TabsContent>
</CardContent>
</Tabs>
@@ -578,9 +842,9 @@
/>
<!-- Chapter link (if not cover) -->
{#if !currentImage.isCover && currentImage.volumeOrder != null && currentImage.chapterOrder}
<a
href="/novels/{novelId}/volumes/{currentImage.volumeOrder}/chapters/{currentImage.chapterOrder}"
class="text-white/80 hover:text-white text-sm inline-flex items-center gap-1 mt-3"
>
From: Ch. {currentImage.chapterOrder} - {currentImage.chapterName}
@@ -595,3 +859,53 @@
</div>
</div>
{/if}
<!-- Delete Confirmation Modal -->
{#if showDeleteConfirm && novel}
<div
class="fixed inset-0 z-50 flex items-center justify-center bg-black/50 backdrop-blur-sm"
onclick={() => !deleting && (showDeleteConfirm = false)}
onkeydown={(e) => e.key === 'Escape' && !deleting && (showDeleteConfirm = false)}
role="dialog"
aria-modal="true"
aria-labelledby="delete-modal-title"
tabindex="-1"
>
<!-- svelte-ignore a11y_click_events_have_key_events -->
<!-- svelte-ignore a11y_no_static_element_interactions -->
<div onclick={(e: MouseEvent) => e.stopPropagation()}>
<Card class="w-full max-w-md mx-4 shadow-xl">
<CardHeader>
<h2 id="delete-modal-title" class="text-lg font-semibold">Delete Novel</h2>
</CardHeader>
<CardContent class="space-y-4">
<p class="text-muted-foreground">
Are you sure you want to delete <strong class="text-foreground">{novel.name}</strong>?
</p>
<p class="text-sm text-muted-foreground">
This will permanently delete the novel, all chapters, images, and translations. This action cannot be undone.
</p>
{#if deleteError}
<p class="text-sm text-destructive">{deleteError}</p>
{/if}
<div class="flex justify-end gap-2 pt-2">
<Button
variant="outline"
onclick={() => (showDeleteConfirm = false)}
disabled={deleting}
>
Cancel
</Button>
<Button
variant="destructive"
onclick={deleteNovel}
disabled={deleting}
>
{deleting ? 'Deleting...' : 'Delete'}
</Button>
</div>
</CardContent>
</Card>
</div>
</div>
{/if}
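The template above reads from `bookmarkLookup` and `chapterLookup` without showing how they are built. A minimal sketch of the bookmark index, with the `Bookmark` shape inferred from how the template uses it (hypothetical names and fields, not the project's actual types):

```typescript
// Hypothetical shape inferred from template usage
// (`bookmark.id`, `bookmark.chapterId`, `bookmark.description`, `bookmark.createdTime`).
interface Bookmark {
  id: number;
  chapterId: number;
  description?: string;
  createdTime: string;
}

// Index bookmarks by chapter id so each chapter row can resolve
// `isBookmarked` and `bookmarkDescription` with a single O(1) lookup.
function buildBookmarkLookup(bookmarks: Bookmark[]): Map<number, Bookmark> {
  return new Map(bookmarks.map((b) => [b.chapterId, b]));
}
```

In the Svelte 5 component this would typically live in a `$derived` so the map recomputes whenever `bookmarks` changes (e.g. after `handleChapterBookmarkChange` fires).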


@@ -8,7 +8,7 @@
import { Input } from '$lib/components/ui/input';
import { Button } from '$lib/components/ui/button';
import { Badge } from '$lib/components/ui/badge';
import { type NovelFilters, type SortField, type SortDirection, hasActiveFilters, EMPTY_FILTERS } from '$lib/utils/filterParams';
import { NovelStatus, type NovelTagDto } from '$lib/graphql/__generated__/graphql';
interface Props {
@@ -34,6 +34,19 @@
{ value: NovelStatus.Unknown, label: 'Unknown' }
];
// Sort options
const sortOptions: { value: `${SortField}-${SortDirection}`; label: string }[] = [
{ value: 'lastUpdatedTime-DESC', label: 'Recently Updated' },
{ value: 'lastUpdatedTime-ASC', label: 'Oldest Updated' },
{ value: 'createdTime-DESC', label: 'Recently Added' },
{ value: 'createdTime-ASC', label: 'Oldest Added' },
{ value: 'name-ASC', label: 'Name (A-Z)' },
{ value: 'name-DESC', label: 'Name (Z-A)' }
];
// Current sort value as combined string for the select
const currentSortValue = $derived(`${filters.sort.field}-${filters.sort.direction}` as const);
// Derived state for display
const selectedStatusLabels = $derived(
filters.statuses.map((s) => statusOptions.find((o) => o.value === s)?.label ?? s).join(', ')
@@ -71,6 +84,12 @@
onFilterChange({ ...filters, tags: selected });
}
// Sort selection handler
function handleSortChange(value: string) {
const [field, direction] = value.split('-') as [SortField, SortDirection];
onFilterChange({ ...filters, sort: { field, direction } });
}
// Clear all filters
function clearFilters() {
searchInput = '';
@@ -196,6 +215,41 @@
</Select.Root>
{/if}
<!-- Sort Dropdown -->
<Select.Root
type="single"
value={currentSortValue}
onValueChange={(v) => v && handleSortChange(v)}
>
<Select.Trigger
class="border-input bg-background ring-offset-background placeholder:text-muted-foreground focus:ring-ring flex h-9 min-w-[160px] items-center justify-between gap-2 rounded-md border px-3 py-2 text-sm shadow-sm focus:outline-none focus:ring-1 disabled:cursor-not-allowed disabled:opacity-50"
>
<span class="truncate text-left">
{sortOptions.find((o) => o.value === currentSortValue)?.label ?? 'Sort by'}
</span>
<ChevronDown class="h-4 w-4 opacity-50" />
</Select.Trigger>
<Select.Content
class="bg-popover text-popover-foreground z-50 max-h-60 min-w-[160px] overflow-auto rounded-md border p-1 shadow-md"
>
{#each sortOptions as option (option.value)}
<Select.Item
value={option.value}
class="hover:bg-accent hover:text-accent-foreground focus:bg-accent focus:text-accent-foreground relative flex cursor-pointer select-none items-center gap-2 rounded-sm px-2 py-1.5 text-sm outline-none data-[disabled]:pointer-events-none data-[disabled]:opacity-50"
>
{#snippet children({ selected })}
<div class="flex h-4 w-4 items-center justify-center">
{#if selected}
<Check class="h-3 w-3" />
{/if}
</div>
<span>{option.label}</span>
{/snippet}
</Select.Item>
{/each}
</Select.Content>
</Select.Root>
<!-- Clear Filters Button -->
{#if hasActiveFilters(filters)}
<Button variant="outline" size="sm" onclick={clearFilters} class="gap-1">

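The filter bar encodes the sort state as a single `${field}-${direction}` string for the select and splits it back apart in `handleSortChange`. A self-contained sketch of that round trip, assuming the `SortField` and `SortDirection` unions match the options listed in the diff (their real definitions live in `$lib/utils/filterParams` and are not shown here):

```typescript
// Assumed unions mirroring the sortOptions in the diff.
type SortField = 'lastUpdatedTime' | 'createdTime' | 'name';
type SortDirection = 'ASC' | 'DESC';

// Combine field and direction into the select's value string.
function toSortValue(field: SortField, direction: SortDirection): string {
  return `${field}-${direction}`;
}

// Invert the encoding. Splitting on '-' is only safe because no
// SortField contains a hyphen; a field like 'last-read' would break this.
function parseSortValue(value: string): { field: SortField; direction: SortDirection } {
  const [field, direction] = value.split('-') as [SortField, SortDirection];
  return { field, direction };
}
```

This is the same split the component's `handleSortChange` performs before calling `onFilterChange`.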

@@ -14,6 +14,7 @@
parseFiltersFromURL,
syncFiltersToURL,
filtersToGraphQLWhere,
sortToGraphQLOrder,
hasActiveFilters,
EMPTY_FILTERS
} from '$lib/utils/filterParams';
@@ -52,8 +53,9 @@
try {
const where = filtersToGraphQLWhere(filters);
const order = sortToGraphQLOrder(filters.sort);
const result = await client
.query(NovelsDocument, { first: PAGE_SIZE, after, where, order })
.toPromise();
if (result.error) {
@@ -116,20 +118,13 @@
<Card class="shadow-md shadow-primary/10">
<CardHeader>
<div class="flex items-center justify-between">
<CardTitle>Controls</CardTitle>
{#if $isAuthenticated}
<Button variant="outline" onclick={() => (showImportModal = true)}>
Import Novel
</Button>
{/if}
</div>
<p class="text-muted-foreground text-sm">
{#if hasActiveFilters(filters)}
Showing filtered results
{:else}
Browse all novels
{/if}
</p>
</CardHeader>
<CardContent>
<NovelFilters {filters} onFilterChange={handleFilterChange} availableTags={availableTags()} />
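The page passes `order` (from `sortToGraphQLOrder`) into the `NovelsDocument` query, but the helper itself is not in the diff. A hypothetical sketch of what it might look like, assuming a HotChocolate-style sort argument where the order input is a list of single-key objects (the actual generated input type may differ):

```typescript
// Assumed unions mirroring the filter bar's sort options.
type SortField = 'lastUpdatedTime' | 'createdTime' | 'name';
type SortDirection = 'ASC' | 'DESC';

interface SortState {
  field: SortField;
  direction: SortDirection;
}

// Hypothetical: map the UI sort state onto a GraphQL order argument,
// e.g. { field: 'lastUpdatedTime', direction: 'DESC' }
// becomes [{ lastUpdatedTime: 'DESC' }].
function sortToGraphQLOrder(sort: SortState): Record<string, SortDirection>[] {
  return [{ [sort.field]: sort.direction }];
}
```

Returning a list keeps the shape compatible with multi-key ordering later (e.g. sort by name, then by created time) without changing the call site.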

Some files were not shown because too many files have changed in this diff.