Compare commits


14 Commits

Author SHA1 Message Date
gamer147
9bc39c3abf [FA-misc] Reporting service seems to be working 2026-02-01 10:19:52 -05:00
gamer147
bdb863a032 [FA-misc] CICD Updates 2026-01-31 10:48:14 -05:00
gamer147
7c3df7ab11 [FA-misc] Add ReportingService consumer unit tests 2026-01-30 16:47:26 -05:00
gamer147
2e4e2c26aa [FA-misc] Add ReportingService Dockerfile and docker-compose entry
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-30 16:45:16 -05:00
gamer147
1057e1bcd4 [FA-misc] Add initial ReportingService migration
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-30 16:43:48 -05:00
gamer147
1fda5ad440 [FA-misc] Wire up ReportingService Program.cs
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-30 16:43:26 -05:00
gamer147
2c14ab4936 [FA-misc] Add GraphQL job queries with filtering and pagination 2026-01-30 16:40:16 -05:00
gamer147
433f038051 [FA-misc] Add JobStatusUpdateConsumer with upsert logic 2026-01-30 16:39:11 -05:00
gamer147
3c835d9cc3 [FA-misc] Add Job entity and ReportingDbContext 2026-01-30 16:38:46 -05:00
gamer147
9577aa996a [FA-misc] Scaffold ReportingService project
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-30 16:33:43 -05:00
gamer147
c25f59a4b4 [FA-misc] Add IJobStatusUpdate event contract and publishing helper 2026-01-30 16:32:01 -05:00
gamer147
be1ebbea39 [FA-misc] Add JobStatus enum 2026-01-30 16:31:42 -05:00
67521d6530 Merge pull request '[FA-misc] Fix issues with novel imports' (#63) from epic/FA-misc_MassTransit into master
All checks were successful
CI / build-backend (push) Successful in 1m6s
CI / build-frontend (push) Successful in 43s
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (push) Successful in 49s
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (push) Successful in 45s
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (push) Successful in 47s
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (push) Successful in 51s
Build Gateway / build-subgraphs (map[name:usernoveldata-service project:FictionArchive.Service.UserNovelDataService subgraph:UserNovelData]) (push) Successful in 47s
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (push) Successful in 2m3s
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (push) Successful in 1m52s
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (push) Successful in 1m44s
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (push) Successful in 1m46s
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserNovelDataService/Dockerfile name:usernoveldata-service]) (push) Successful in 1m38s
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (push) Successful in 1m40s
Release / build-frontend (push) Successful in 1m44s
Build Gateway / build-gateway (push) Successful in 3m29s
Reviewed-on: #63
2026-01-30 17:55:40 +00:00
3820cb3af9 Merge pull request 'epic/FA-misc_MassTransit' (#62) from epic/FA-misc_MassTransit into master
Some checks failed
CI / build-backend (push) Successful in 1m10s
CI / build-frontend (push) Successful in 45s
Build Gateway / build-subgraphs (map[name:novel-service project:FictionArchive.Service.NovelService subgraph:Novel]) (push) Successful in 48s
Build Gateway / build-subgraphs (map[name:scheduler-service project:FictionArchive.Service.SchedulerService subgraph:Scheduler]) (push) Successful in 44s
Build Gateway / build-subgraphs (map[name:translation-service project:FictionArchive.Service.TranslationService subgraph:Translation]) (push) Successful in 56s
Build Gateway / build-subgraphs (map[name:user-service project:FictionArchive.Service.UserService subgraph:User]) (push) Successful in 45s
Build Gateway / build-subgraphs (map[name:usernoveldata-service project:FictionArchive.Service.UserNovelDataService subgraph:UserNovelData]) (push) Successful in 51s
Release / build-and-push (map[dockerfile:FictionArchive.Service.AuthenticationService/Dockerfile name:authentication-service]) (push) Failing after 29s
Release / build-and-push (map[dockerfile:FictionArchive.Service.FileService/Dockerfile name:file-service]) (push) Successful in 2m5s
Release / build-and-push (map[dockerfile:FictionArchive.Service.NovelService/Dockerfile name:novel-service]) (push) Successful in 1m59s
Release / build-and-push (map[dockerfile:FictionArchive.Service.SchedulerService/Dockerfile name:scheduler-service]) (push) Successful in 1m47s
Release / build-and-push (map[dockerfile:FictionArchive.Service.TranslationService/Dockerfile name:translation-service]) (push) Successful in 2m15s
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserNovelDataService/Dockerfile name:usernoveldata-service]) (push) Successful in 1m46s
Release / build-and-push (map[dockerfile:FictionArchive.Service.UserService/Dockerfile name:user-service]) (push) Successful in 1m46s
Release / build-frontend (push) Successful in 2m22s
Build Gateway / build-gateway (push) Successful in 3m39s
Reviewed-on: #62
2026-01-29 14:59:19 +00:00
28 changed files with 1029 additions and 6 deletions

View File

@@ -31,6 +31,9 @@ jobs:
- name: usernoveldata-service
project: FictionArchive.Service.UserNovelDataService
subgraph: UserNovelData
- name: reporting-service
project: FictionArchive.Service.ReportingService
subgraph: Reporting
steps:
- name: Checkout
uses: actions/checkout@v4
@@ -119,6 +122,12 @@ jobs:
name: usernoveldata-service-subgraph
path: subgraphs/usernoveldata
- name: Download Reporting Service subgraph
uses: christopherhx/gitea-download-artifact@v4
with:
name: reporting-service-subgraph
path: subgraphs/reporting
- name: Configure subgraph URLs for Docker
run: |
for fsp in subgraphs/*/*.fsp; do

View File

@@ -27,6 +27,8 @@ jobs:
dockerfile: FictionArchive.Service.SchedulerService/Dockerfile
- name: usernoveldata-service
dockerfile: FictionArchive.Service.UserNovelDataService/Dockerfile
- name: reporting-service
dockerfile: FictionArchive.Service.ReportingService/Dockerfile
steps:
- name: Checkout
uses: actions/checkout@v4

View File

@@ -0,0 +1,9 @@
namespace FictionArchive.Common.Enums;
public enum JobStatus
{
Failed = -1,
Pending = 0,
InProgress = 1,
Completed = 2
}

View File

@@ -3,6 +3,7 @@ using Amazon.S3.Model;
using FictionArchive.Common.Enums;
using FictionArchive.Service.FileService.Models;
using FictionArchive.Service.Shared.Contracts.Events;
using FictionArchive.Service.Shared.Extensions;
using MassTransit;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
@@ -35,6 +36,10 @@ public class FileUploadRequestCreatedConsumer : IConsumer<IFileUploadRequestCrea
{
var message = context.Message;
await _publishEndpoint.ReportJobStatus(
message.RequestId, "FileUpload", $"Upload {message.FilePath}",
JobStatus.InProgress, parentJobId: message.ImportId);
var putObjectRequest = new PutObjectRequest
{
BucketName = _s3Configuration.Bucket,
@@ -58,6 +63,11 @@ public class FileUploadRequestCreatedConsumer : IConsumer<IFileUploadRequestCrea
Status: RequestStatus.Failed,
FileAccessUrl: null,
ErrorMessage: "An error occurred while uploading file to S3."));
await _publishEndpoint.ReportJobStatus(
message.RequestId, "FileUpload", $"Upload {message.FilePath}",
JobStatus.Failed, parentJobId: message.ImportId,
errorMessage: "An error occurred while uploading file to S3.");
return;
}
@@ -72,5 +82,10 @@ public class FileUploadRequestCreatedConsumer : IConsumer<IFileUploadRequestCrea
Status: RequestStatus.Success,
FileAccessUrl: fileAccessUrl,
ErrorMessage: null));
await _publishEndpoint.ReportJobStatus(
message.RequestId, "FileUpload", $"Upload {message.FilePath}",
JobStatus.Completed, parentJobId: message.ImportId,
metadata: new Dictionary<string, string> { ["FileAccessUrl"] = fileAccessUrl });
}
}
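The consumers and the saga above all publish through a `ReportJobStatus` extension from `FictionArchive.Service.Shared.Extensions`, whose definition is not included in this compare. A hedged sketch of what that helper plausibly looks like, with the signature inferred from the call sites and the `JobStatusUpdate` payload consumed by the ReportingService — treat the exact parameter order and record shape as assumptions:

```csharp
// Hypothetical sketch of the ReportJobStatus helper referenced in the diffs.
// The real definition lives in FictionArchive.Service.Shared.Extensions and is
// not part of this compare; the signature is inferred from its call sites.
using FictionArchive.Common.Enums;
using FictionArchive.Service.Shared.Contracts.Events;
using MassTransit;

namespace FictionArchive.Service.Shared.Extensions;

public static class JobStatusReportingExtensions
{
    public static Task ReportJobStatus(
        this IPublishEndpoint publishEndpoint,
        Guid jobId,
        string jobType,
        string displayName,
        JobStatus status,
        Guid? parentJobId = null,
        string? errorMessage = null,
        Dictionary<string, string>? metadata = null)
    {
        // Publishes the same IJobStatusUpdate event the saga emits directly,
        // so every producer funnels through a single event contract.
        return publishEndpoint.Publish<IJobStatusUpdate>(new JobStatusUpdate(
            jobId, parentJobId, jobType, displayName, status, errorMessage, metadata));
    }
}
```

Centralizing the publish in one extension keeps the event shape consistent across FileService, NovelService, and the saga, which is what lets `JobStatusUpdateConsumer` treat every producer uniformly.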

View File

@@ -27,6 +27,11 @@ public class NovelImportSagaTests
var sagaHarness = harness.GetSagaStateMachineHarness<NovelImportSaga, NovelImportSagaState>();
(await sagaHarness.Exists(importId, x => x.Importing)).HasValue.Should().BeTrue();
(await harness.Published.Any<IJobStatusUpdate>(x =>
x.Context.Message.JobId == importId &&
x.Context.Message.Status == JobStatus.InProgress &&
x.Context.Message.JobType == "NovelImport")).Should().BeTrue();
}
[Fact]
@@ -45,6 +50,11 @@ public class NovelImportSagaTests
(await harness.Published.Any<INovelImportCompleted>(x =>
x.Context.Message.ImportId == importId && x.Context.Message.Success)).Should().BeTrue();
(await harness.Published.Any<IJobStatusUpdate>(x =>
x.Context.Message.JobId == importId &&
x.Context.Message.Status == JobStatus.Completed &&
x.Context.Message.JobType == "NovelImport")).Should().BeTrue();
}
[Fact]
@@ -79,6 +89,11 @@ public class NovelImportSagaTests
var sagaHarness = harness.GetSagaStateMachineHarness<NovelImportSaga, NovelImportSagaState>();
(await sagaHarness.Exists(importId, x => x.Completed)).HasValue.Should().BeTrue();
(await harness.Published.Any<IJobStatusUpdate>(x =>
x.Context.Message.JobId == importId &&
x.Context.Message.Status == JobStatus.Completed &&
x.Context.Message.JobType == "NovelImport")).Should().BeTrue();
}
[Fact]
@@ -121,6 +136,48 @@ public class NovelImportSagaTests
(await harness.Published.Any<INovelImportCompleted>(x =>
x.Context.Message.ImportId == importId && x.Context.Message.Success)).Should().BeTrue();
(await harness.Published.Any<IJobStatusUpdate>(x =>
x.Context.Message.JobId == importId &&
x.Context.Message.Status == JobStatus.Completed &&
x.Context.Message.JobType == "NovelImport")).Should().BeTrue();
}
[Fact]
public async Task Should_publish_failed_job_status_on_chapter_pull_fault()
{
await using var provider = CreateTestProvider();
var harness = provider.GetRequiredService<ITestHarness>();
await harness.Start();
var importId = Guid.NewGuid();
await harness.Bus.Publish<INovelImportRequested>(new NovelImportRequested(importId, "https://example.com/novel"));
await harness.Bus.Publish<INovelMetadataImported>(new NovelMetadataImported(importId, 1, 1, false));
var sagaHarness = harness.GetSagaStateMachineHarness<NovelImportSaga, NovelImportSagaState>();
(await sagaHarness.Exists(importId, x => x.Processing)).HasValue.Should().BeTrue();
await harness.Bus.Publish<Fault<IChapterPullRequested>>(new
{
Message = new ChapterPullRequested(importId, 1, 1, 1),
Exceptions = new[]
{
new
{
ExceptionType = typeof(Exception).FullName!,
Message = "Chapter pull failed",
StackTrace = "stack trace",
InnerException = (object?)null
}
}
});
(await sagaHarness.Exists(importId, x => x.Failed)).HasValue.Should().BeTrue();
(await harness.Published.Any<IJobStatusUpdate>(x =>
x.Context.Message.JobId == importId &&
x.Context.Message.Status == JobStatus.Failed &&
x.Context.Message.JobType == "NovelImport")).Should().BeTrue();
}
private ServiceProvider CreateTestProvider()

View File

@@ -1,5 +1,7 @@
using FictionArchive.Common.Enums;
using FictionArchive.Service.NovelService.Services;
using FictionArchive.Service.Shared.Contracts.Events;
using FictionArchive.Service.Shared.Extensions;
using MassTransit;
using Microsoft.Extensions.Logging;
@@ -21,6 +23,11 @@ public class ChapterPullRequestedConsumer : IConsumer<IChapterPullRequested>
public async Task Consume(ConsumeContext<IChapterPullRequested> context)
{
var message = context.Message;
var chapterJobId = Guid.NewGuid();
await context.ReportJobStatus(
chapterJobId, "ChapterPull", $"Pull chapter {message.ChapterOrder}",
JobStatus.InProgress, parentJobId: message.ImportId);
var (chapter, imageCount) = await _novelUpdateService.PullChapterContents(
message.ImportId,
@@ -33,5 +40,10 @@ public class ChapterPullRequestedConsumer : IConsumer<IChapterPullRequested>
chapter.Id,
imageCount
));
await context.ReportJobStatus(
chapterJobId, "ChapterPull", $"Pull chapter {message.ChapterOrder}",
JobStatus.Completed, parentJobId: message.ImportId,
metadata: new Dictionary<string, string> { ["ChapterId"] = chapter.Id.ToString() });
}
}

View File

@@ -1,3 +1,4 @@
using FictionArchive.Common.Enums;
using FictionArchive.Service.Shared.Contracts.Events;
using MassTransit;
using NodaTime;
@@ -49,6 +50,10 @@ public class NovelImportSaga : MassTransitStateMachine<NovelImportSagaState>
ctx.Saga.StartedAt = _clock.GetCurrentInstant();
})
.TransitionTo(Importing)
.PublishAsync(ctx => ctx.Init<IJobStatusUpdate>(new JobStatusUpdate(
ctx.Saga.CorrelationId, null, "NovelImport",
$"Import {ctx.Saga.NovelUrl}", JobStatus.InProgress,
null, new Dictionary<string, string> { ["NovelUrl"] = ctx.Saga.NovelUrl })))
);
During(Importing,
@@ -68,7 +73,11 @@ public class NovelImportSaga : MassTransitStateMachine<NovelImportSagaState>
ctx.Saga.CorrelationId,
ctx.Saga.NovelId,
true,
null)))
.PublishAsync(ctx => ctx.Init<IJobStatusUpdate>(new JobStatusUpdate(
ctx.Saga.CorrelationId, null, "NovelImport",
$"Import {ctx.Saga.NovelUrl}", JobStatus.Completed,
null, new Dictionary<string, string> { ["NovelId"] = ctx.Saga.NovelId.ToString() }))),
elseBinder => elseBinder.TransitionTo(Processing)
)
);
@@ -87,7 +96,11 @@ public class NovelImportSaga : MassTransitStateMachine<NovelImportSagaState>
c.Saga.CorrelationId,
c.Saga.NovelId,
true,
null)))
.PublishAsync(c => c.Init<IJobStatusUpdate>(new JobStatusUpdate(
c.Saga.CorrelationId, null, "NovelImport",
$"Import {c.Saga.NovelUrl}", JobStatus.Completed,
null, new Dictionary<string, string> { ["NovelId"] = c.Saga.NovelId.ToString() })))),
When(FileUploadStatusUpdate)
.Then(ctx => ctx.Saga.CompletedImages++)
@@ -98,7 +111,11 @@ public class NovelImportSaga : MassTransitStateMachine<NovelImportSagaState>
c.Saga.CorrelationId,
c.Saga.NovelId,
true,
null)))
.PublishAsync(c => c.Init<IJobStatusUpdate>(new JobStatusUpdate(
c.Saga.CorrelationId, null, "NovelImport",
$"Import {c.Saga.NovelUrl}", JobStatus.Completed,
null, new Dictionary<string, string> { ["NovelId"] = c.Saga.NovelId.ToString() })))),
When(ChapterPullFaulted)
.Then(ctx =>
@@ -111,7 +128,11 @@ public class NovelImportSaga : MassTransitStateMachine<NovelImportSagaState>
ctx.Saga.CorrelationId,
ctx.Saga.NovelId,
false,
ctx.Saga.ErrorMessage)))
.PublishAsync(ctx => ctx.Init<IJobStatusUpdate>(new JobStatusUpdate(
ctx.Saga.CorrelationId, null, "NovelImport",
$"Import {ctx.Saga.NovelUrl}", JobStatus.Failed,
ctx.Saga.ErrorMessage, null))),
When(FileUploadFaulted)
.Then(ctx =>
@@ -125,6 +146,10 @@ public class NovelImportSaga : MassTransitStateMachine<NovelImportSagaState>
ctx.Saga.NovelId,
false,
ctx.Saga.ErrorMessage)))
.PublishAsync(ctx => ctx.Init<IJobStatusUpdate>(new JobStatusUpdate(
ctx.Saga.CorrelationId, null, "NovelImport",
$"Import {ctx.Saga.NovelUrl}", JobStatus.Failed,
ctx.Saga.ErrorMessage, null)))
);
SetCompletedWhenFinalized();

View File

@@ -0,0 +1,233 @@
using System.Text.Json;
using FictionArchive.Common.Enums;
using FictionArchive.Service.ReportingService.Consumers;
using FictionArchive.Service.ReportingService.Models;
using FictionArchive.Service.ReportingService.Services;
using FictionArchive.Service.Shared.Contracts.Events;
using FluentAssertions;
using MassTransit;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;
using NSubstitute;
using Xunit;
namespace FictionArchive.Service.ReportingService.Tests.Consumers;
public class JobStatusUpdateConsumerTests : IDisposable
{
private readonly ReportingDbContext _dbContext;
private readonly JobStatusUpdateConsumer _consumer;
public JobStatusUpdateConsumerTests()
{
var options = new DbContextOptionsBuilder<ReportingDbContext>()
.UseInMemoryDatabase(databaseName: Guid.NewGuid().ToString())
.Options;
_dbContext = new TestReportingDbContext(options, NullLogger<ReportingDbContext>.Instance);
_consumer = new JobStatusUpdateConsumer(
NullLogger<JobStatusUpdateConsumer>.Instance,
_dbContext);
}
[Fact]
public async Task Should_create_new_job_on_first_event()
{
var jobId = Guid.NewGuid();
var context = CreateConsumeContext(new JobStatusUpdate(
jobId, null, "TestJob", "Test job display",
JobStatus.InProgress, null, new() { ["key1"] = "value1" }));
await _consumer.Consume(context);
var job = await _dbContext.Jobs.FindAsync(jobId);
job.Should().NotBeNull();
job!.JobType.Should().Be("TestJob");
job.DisplayName.Should().Be("Test job display");
job.Status.Should().Be(JobStatus.InProgress);
job.Metadata.Should().ContainKey("key1").WhoseValue.Should().Be("value1");
}
[Fact]
public async Task Should_update_status_on_subsequent_event()
{
var jobId = Guid.NewGuid();
// First event: create
await _consumer.Consume(CreateConsumeContext(new JobStatusUpdate(
jobId, null, "TestJob", "Test job",
JobStatus.InProgress, null, null)));
// Second event: update
await _consumer.Consume(CreateConsumeContext(new JobStatusUpdate(
jobId, null, "TestJob", "Test job",
JobStatus.Completed, null, null)));
var job = await _dbContext.Jobs.FindAsync(jobId);
job!.Status.Should().Be(JobStatus.Completed);
}
[Fact]
public async Task Should_merge_metadata_on_update()
{
var jobId = Guid.NewGuid();
// First event with initial metadata
await _consumer.Consume(CreateConsumeContext(new JobStatusUpdate(
jobId, null, "TestJob", "Test job",
JobStatus.InProgress, null, new() { ["NovelId"] = "42" })));
// Second event with additional metadata
await _consumer.Consume(CreateConsumeContext(new JobStatusUpdate(
jobId, null, "TestJob", "Test job",
JobStatus.Completed, null, new() { ["ChapterId"] = "7" })));
var job = await _dbContext.Jobs.FindAsync(jobId);
job!.Metadata.Should().ContainKey("NovelId").WhoseValue.Should().Be("42");
job.Metadata.Should().ContainKey("ChapterId").WhoseValue.Should().Be("7");
}
[Fact]
public async Task Should_not_overwrite_job_type_on_update()
{
var jobId = Guid.NewGuid();
await _consumer.Consume(CreateConsumeContext(new JobStatusUpdate(
jobId, null, "OriginalType", "Test job",
JobStatus.InProgress, null, null)));
await _consumer.Consume(CreateConsumeContext(new JobStatusUpdate(
jobId, null, "DifferentType", "Test job",
JobStatus.Completed, null, null)));
var job = await _dbContext.Jobs.FindAsync(jobId);
job!.JobType.Should().Be("OriginalType");
}
[Fact]
public async Task Should_not_overwrite_parent_job_id_on_update()
{
var jobId = Guid.NewGuid();
var parentId = Guid.NewGuid();
await _consumer.Consume(CreateConsumeContext(new JobStatusUpdate(
jobId, parentId, "TestJob", "Test job",
JobStatus.InProgress, null, null)));
await _consumer.Consume(CreateConsumeContext(new JobStatusUpdate(
jobId, null, "TestJob", "Test job",
JobStatus.Completed, null, null)));
var job = await _dbContext.Jobs.FindAsync(jobId);
job!.ParentJobId.Should().Be(parentId);
}
[Fact]
public async Task Should_set_error_message_on_failure()
{
var jobId = Guid.NewGuid();
await _consumer.Consume(CreateConsumeContext(new JobStatusUpdate(
jobId, null, "TestJob", "Test job",
JobStatus.Failed, "Something went wrong", null)));
var job = await _dbContext.Jobs.FindAsync(jobId);
job!.Status.Should().Be(JobStatus.Failed);
job.ErrorMessage.Should().Be("Something went wrong");
}
[Fact]
public async Task Should_store_parent_job_id()
{
var parentId = Guid.NewGuid();
var childId = Guid.NewGuid();
await _consumer.Consume(CreateConsumeContext(new JobStatusUpdate(
parentId, null, "ParentJob", "Parent",
JobStatus.InProgress, null, null)));
await _consumer.Consume(CreateConsumeContext(new JobStatusUpdate(
childId, parentId, "ChildJob", "Child",
JobStatus.InProgress, null, null)));
var child = await _dbContext.Jobs.FindAsync(childId);
child!.ParentJobId.Should().Be(parentId);
}
[Fact]
public async Task Should_handle_null_metadata_on_create()
{
var jobId = Guid.NewGuid();
await _consumer.Consume(CreateConsumeContext(new JobStatusUpdate(
jobId, null, "TestJob", "Test job",
JobStatus.InProgress, null, null)));
var job = await _dbContext.Jobs.FindAsync(jobId);
job!.Metadata.Should().BeNull();
}
[Fact]
public async Task Should_add_metadata_to_job_with_null_metadata()
{
var jobId = Guid.NewGuid();
// First event: no metadata
await _consumer.Consume(CreateConsumeContext(new JobStatusUpdate(
jobId, null, "TestJob", "Test job",
JobStatus.InProgress, null, null)));
// Second event: adds metadata
await _consumer.Consume(CreateConsumeContext(new JobStatusUpdate(
jobId, null, "TestJob", "Test job",
JobStatus.Completed, null, new() { ["result"] = "success" })));
var job = await _dbContext.Jobs.FindAsync(jobId);
job!.Metadata.Should().ContainKey("result").WhoseValue.Should().Be("success");
}
private static ConsumeContext<IJobStatusUpdate> CreateConsumeContext(JobStatusUpdate message)
{
var context = Substitute.For<ConsumeContext<IJobStatusUpdate>>();
context.Message.Returns(message);
return context;
}
public void Dispose()
{
_dbContext.Dispose();
}
/// <summary>
/// Test-specific subclass that adds a JSON value converter for Dictionary properties,
/// since the InMemory provider does not support the jsonb column type used in production.
/// </summary>
private class TestReportingDbContext : ReportingDbContext
{
public TestReportingDbContext(DbContextOptions options, ILogger<ReportingDbContext> logger)
: base(options, logger)
{
}
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
base.OnModelCreating(modelBuilder);
modelBuilder.Entity<Job>(entity =>
{
entity.Property(j => j.Metadata)
.HasConversion(
v => v == null ? null : JsonSerializer.Serialize(v, (JsonSerializerOptions?)null),
v => v == null ? null : JsonSerializer.Deserialize<Dictionary<string, string>>(v, (JsonSerializerOptions?)null))
.HasColumnType(null!);
});
}
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
// Skip base OnConfiguring to avoid adding AuditInterceptor
// which is not needed for unit tests
}
}
}

View File

@@ -0,0 +1,32 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net8.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<IsPackable>false</IsPackable>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="FluentAssertions" Version="6.12.0" />
<PackageReference Include="MassTransit" Version="8.5.7" />
<PackageReference Include="Microsoft.EntityFrameworkCore.InMemory" Version="9.0.11" />
<PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.11.1" />
<PackageReference Include="NodaTime.Testing" Version="3.3.0" />
<PackageReference Include="NSubstitute" Version="5.1.0" />
<PackageReference Include="xunit" Version="2.9.2" />
<PackageReference Include="xunit.runner.visualstudio" Version="2.8.2">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageReference Include="coverlet.collector" Version="6.0.2">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\FictionArchive.Service.ReportingService\FictionArchive.Service.ReportingService.csproj" />
</ItemGroup>
</Project>

View File

@@ -0,0 +1,66 @@
using FictionArchive.Service.ReportingService.Models;
using FictionArchive.Service.ReportingService.Services;
using FictionArchive.Service.Shared.Contracts.Events;
using MassTransit;
using Microsoft.EntityFrameworkCore;
namespace FictionArchive.Service.ReportingService.Consumers;
public class JobStatusUpdateConsumer : IConsumer<IJobStatusUpdate>
{
private readonly ILogger<JobStatusUpdateConsumer> _logger;
private readonly ReportingDbContext _dbContext;
public JobStatusUpdateConsumer(
ILogger<JobStatusUpdateConsumer> logger,
ReportingDbContext dbContext)
{
_logger = logger;
_dbContext = dbContext;
}
public async Task Consume(ConsumeContext<IJobStatusUpdate> context)
{
var message = context.Message;
var existingJob = await _dbContext.Jobs.FirstOrDefaultAsync(j => j.Id == message.JobId);
if (existingJob == null)
{
var job = new Job
{
Id = message.JobId,
ParentJobId = message.ParentJobId,
JobType = message.JobType,
DisplayName = message.DisplayName,
Status = message.Status,
ErrorMessage = message.ErrorMessage,
Metadata = message.Metadata != null
? new Dictionary<string, string>(message.Metadata)
: null
};
_dbContext.Jobs.Add(job);
_logger.LogInformation("Created job {JobId} of type {JobType}", message.JobId, message.JobType);
}
else
{
existingJob.Status = message.Status;
existingJob.DisplayName = message.DisplayName;
existingJob.ErrorMessage = message.ErrorMessage;
if (message.Metadata != null)
{
existingJob.Metadata ??= new Dictionary<string, string>();
foreach (var kvp in message.Metadata)
{
existingJob.Metadata[kvp.Key] = kvp.Value;
}
}
_logger.LogInformation("Updated job {JobId} to status {Status}", message.JobId, message.Status);
}
await _dbContext.SaveChangesAsync();
}
}

View File

@@ -0,0 +1,23 @@
FROM mcr.microsoft.com/dotnet/aspnet:8.0 AS base
USER $APP_UID
WORKDIR /app
EXPOSE 8080
EXPOSE 8081
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
ARG BUILD_CONFIGURATION=Release
WORKDIR /src
COPY ["FictionArchive.Service.ReportingService/FictionArchive.Service.ReportingService.csproj", "FictionArchive.Service.ReportingService/"]
RUN dotnet restore "FictionArchive.Service.ReportingService/FictionArchive.Service.ReportingService.csproj"
COPY . .
WORKDIR "/src/FictionArchive.Service.ReportingService"
RUN dotnet build "./FictionArchive.Service.ReportingService.csproj" -c $BUILD_CONFIGURATION -o /app/build
FROM build AS publish
ARG BUILD_CONFIGURATION=Release
RUN dotnet publish "./FictionArchive.Service.ReportingService.csproj" -c $BUILD_CONFIGURATION -o /app/publish /p:UseAppHost=false
FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "FictionArchive.Service.ReportingService.dll"]
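Commit 2e4e2c26aa also adds a docker-compose entry for the new service, but that file is not part of this compare. A hedged sketch of what such an entry might look like for the Dockerfile above — the service name, ports, and dependency names (`postgres`, `rabbitmq`) are assumptions, not taken from the repo:

```yaml
# Hypothetical docker-compose entry for ReportingService; the actual entry
# from commit 2e4e2c26aa is not shown in this compare, so the dependency
# names and port mapping are assumptions.
reporting-service:
  build:
    context: .
    dockerfile: FictionArchive.Service.ReportingService/Dockerfile
  ports:
    - "8080"          # ASP.NET Core default HTTP port exposed by the base image
  depends_on:
    - postgres        # ReportingDbContext targets PostgreSQL (jsonb, NodaTime Instant)
    - rabbitmq        # MassTransit transport for IJobStatusUpdate events
```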

View File

@@ -0,0 +1,29 @@
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>net8.0</TargetFramework>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<DockerDefaultTargetOS>Linux</DockerDefaultTargetOS>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="HotChocolate.AspNetCore" Version="15.1.11" />
<PackageReference Include="HotChocolate.Data.EntityFramework" Version="15.1.11" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="9.0.11">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
</ItemGroup>
<ItemGroup>
<Content Include="..\.dockerignore">
<Link>.dockerignore</Link>
</Content>
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\FictionArchive.Service.Shared\FictionArchive.Service.Shared.csproj" />
</ItemGroup>
</Project>

View File

@@ -0,0 +1,26 @@
using FictionArchive.Service.ReportingService.Models;
using FictionArchive.Service.ReportingService.Services;
using HotChocolate.Authorization;
using HotChocolate.Data;
namespace FictionArchive.Service.ReportingService.GraphQL;
[QueryType]
public static class JobQueries
{
[UseProjection]
[Authorize]
[UseFirstOrDefault]
public static IQueryable<Job> GetJobById(
Guid jobId,
ReportingDbContext db)
=> db.Jobs.Where(j => j.Id == jobId);
[UsePaging]
[UseProjection]
[UseFiltering]
[UseSorting]
[Authorize]
public static IQueryable<Job> GetJobs(ReportingDbContext db)
=> db.Jobs;
}
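The `[UsePaging]`, `[UseFiltering]`, and `[UseSorting]` attributes on `GetJobs` expose a connection-style GraphQL field with generated filter and sort inputs. A sketch of a client query against it — field and input names assume HotChocolate's default naming conventions and are not confirmed by this diff:

```graphql
# Hypothetical query against the Reporting subgraph; field names assume
# HotChocolate's default conventions for the attributes above.
query FailedJobs {
  jobs(
    first: 10
    where: { status: { eq: FAILED } }
    order: { lastUpdatedTime: DESC }
  ) {
    nodes {
      id
      jobType
      displayName
      errorMessage
      parentJobId
    }
  }
}
```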

View File

@@ -0,0 +1,86 @@
// <auto-generated />
using System;
using System.Collections.Generic;
using FictionArchive.Service.ReportingService.Services;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Migrations;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
using NodaTime;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace FictionArchive.Service.ReportingService.Migrations
{
[DbContext(typeof(ReportingDbContext))]
[Migration("20260130214338_InitialCreate")]
partial class InitialCreate
{
/// <inheritdoc />
protected override void BuildTargetModel(ModelBuilder modelBuilder)
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("ProductVersion", "9.0.11")
.HasAnnotation("Relational:MaxIdentifierLength", 63);
NpgsqlModelBuilderExtensions.UseIdentityByDefaultColumns(modelBuilder);
modelBuilder.Entity("FictionArchive.Service.ReportingService.Models.Job", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<string>("DisplayName")
.IsRequired()
.HasColumnType("text");
b.Property<string>("ErrorMessage")
.HasColumnType("text");
b.Property<string>("JobType")
.IsRequired()
.HasColumnType("text");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Dictionary<string, string>>("Metadata")
.HasColumnType("jsonb");
b.Property<Guid?>("ParentJobId")
.HasColumnType("uuid");
b.Property<int>("Status")
.HasColumnType("integer");
b.HasKey("Id");
b.HasIndex("ParentJobId");
b.ToTable("Jobs");
});
modelBuilder.Entity("FictionArchive.Service.ReportingService.Models.Job", b =>
{
b.HasOne("FictionArchive.Service.ReportingService.Models.Job", "ParentJob")
.WithMany("ChildJobs")
.HasForeignKey("ParentJobId")
.OnDelete(DeleteBehavior.SetNull);
b.Navigation("ParentJob");
});
modelBuilder.Entity("FictionArchive.Service.ReportingService.Models.Job", b =>
{
b.Navigation("ChildJobs");
});
#pragma warning restore 612, 618
}
}
}

View File

@@ -0,0 +1,54 @@
using System;
using System.Collections.Generic;
using Microsoft.EntityFrameworkCore.Migrations;
using NodaTime;
#nullable disable
namespace FictionArchive.Service.ReportingService.Migrations
{
/// <inheritdoc />
public partial class InitialCreate : Migration
{
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.CreateTable(
name: "Jobs",
columns: table => new
{
Id = table.Column<Guid>(type: "uuid", nullable: false),
ParentJobId = table.Column<Guid>(type: "uuid", nullable: true),
JobType = table.Column<string>(type: "text", nullable: false),
DisplayName = table.Column<string>(type: "text", nullable: false),
Status = table.Column<int>(type: "integer", nullable: false),
ErrorMessage = table.Column<string>(type: "text", nullable: true),
Metadata = table.Column<Dictionary<string, string>>(type: "jsonb", nullable: true),
CreatedTime = table.Column<Instant>(type: "timestamp with time zone", nullable: false),
LastUpdatedTime = table.Column<Instant>(type: "timestamp with time zone", nullable: false)
},
constraints: table =>
{
table.PrimaryKey("PK_Jobs", x => x.Id);
table.ForeignKey(
name: "FK_Jobs_Jobs_ParentJobId",
column: x => x.ParentJobId,
principalTable: "Jobs",
principalColumn: "Id",
onDelete: ReferentialAction.SetNull);
});
migrationBuilder.CreateIndex(
name: "IX_Jobs_ParentJobId",
table: "Jobs",
column: "ParentJobId");
}
/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropTable(
name: "Jobs");
}
}
}
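The `Up` migration above corresponds roughly to the following PostgreSQL DDL. This is a sketch of what EF Core would generate for this model, not verbatim provider output:

```sql
CREATE TABLE "Jobs" (
    "Id" uuid NOT NULL,
    "ParentJobId" uuid NULL,
    "JobType" text NOT NULL,
    "DisplayName" text NOT NULL,
    "Status" integer NOT NULL,
    "ErrorMessage" text NULL,
    "Metadata" jsonb NULL,
    "CreatedTime" timestamp with time zone NOT NULL,
    "LastUpdatedTime" timestamp with time zone NOT NULL,
    CONSTRAINT "PK_Jobs" PRIMARY KEY ("Id"),
    CONSTRAINT "FK_Jobs_Jobs_ParentJobId" FOREIGN KEY ("ParentJobId")
        REFERENCES "Jobs" ("Id") ON DELETE SET NULL
);

CREATE INDEX "IX_Jobs_ParentJobId" ON "Jobs" ("ParentJobId");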

View File

@@ -0,0 +1,83 @@
// <auto-generated />
using System;
using System.Collections.Generic;
using FictionArchive.Service.ReportingService.Services;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
using NodaTime;
using Npgsql.EntityFrameworkCore.PostgreSQL.Metadata;
#nullable disable
namespace FictionArchive.Service.ReportingService.Migrations
{
[DbContext(typeof(ReportingDbContext))]
partial class ReportingDbContextModelSnapshot : ModelSnapshot
{
protected override void BuildModel(ModelBuilder modelBuilder)
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("ProductVersion", "9.0.11")
.HasAnnotation("Relational:MaxIdentifierLength", 63);
NpgsqlModelBuilderExtensions.UseIdentityByDefaultColumns(modelBuilder);
modelBuilder.Entity("FictionArchive.Service.ReportingService.Models.Job", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uuid");
b.Property<Instant>("CreatedTime")
.HasColumnType("timestamp with time zone");
b.Property<string>("DisplayName")
.IsRequired()
.HasColumnType("text");
b.Property<string>("ErrorMessage")
.HasColumnType("text");
b.Property<string>("JobType")
.IsRequired()
.HasColumnType("text");
b.Property<Instant>("LastUpdatedTime")
.HasColumnType("timestamp with time zone");
b.Property<Dictionary<string, string>>("Metadata")
.HasColumnType("jsonb");
b.Property<Guid?>("ParentJobId")
.HasColumnType("uuid");
b.Property<int>("Status")
.HasColumnType("integer");
b.HasKey("Id");
b.HasIndex("ParentJobId");
b.ToTable("Jobs");
});
modelBuilder.Entity("FictionArchive.Service.ReportingService.Models.Job", b =>
{
b.HasOne("FictionArchive.Service.ReportingService.Models.Job", "ParentJob")
.WithMany("ChildJobs")
.HasForeignKey("ParentJobId")
.OnDelete(DeleteBehavior.SetNull);
b.Navigation("ParentJob");
});
modelBuilder.Entity("FictionArchive.Service.ReportingService.Models.Job", b =>
{
b.Navigation("ChildJobs");
});
#pragma warning restore 612, 618
}
}
}

View File

@@ -0,0 +1,18 @@
using FictionArchive.Common.Enums;
using FictionArchive.Service.Shared.Models;
namespace FictionArchive.Service.ReportingService.Models;
public class Job : BaseEntity<Guid>
{
public Guid? ParentJobId { get; set; }
public string JobType { get; set; } = null!;
public string DisplayName { get; set; } = null!;
public JobStatus Status { get; set; }
public string? ErrorMessage { get; set; }
public Dictionary<string, string>? Metadata { get; set; }
// Navigation
public Job? ParentJob { get; set; }
public List<Job> ChildJobs { get; set; } = [];
}
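The self-referencing relationship lets jobs form trees; because the foreign key uses `DeleteBehavior.SetNull`, deleting a parent orphans its children rather than cascading. A hypothetical usage sketch (the `JobStatus` members are assumptions, since the enum is not part of this diff):

```csharp
// Hypothetical sketch: a parent import job with one child fetch job.
var parent = new Job
{
    Id = Guid.NewGuid(),
    JobType = "NovelImport",
    DisplayName = "Import: Example Novel",
    Status = JobStatus.Running // assumed enum member
};
var child = new Job
{
    Id = Guid.NewGuid(),
    ParentJob = parent,
    JobType = "ChapterFetch",
    DisplayName = "Fetch chapter 1",
    Status = JobStatus.Pending, // assumed enum member
    Metadata = new() { ["chapter"] = "1" }
};

db.Jobs.AddRange(parent, child);
await db.SaveChangesAsync();

// Deleting the parent later leaves the child row in place
// with ParentJobId set to NULL (DeleteBehavior.SetNull).
```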

View File

@@ -0,0 +1,79 @@
using FictionArchive.Common.Extensions;
using FictionArchive.Service.ReportingService.Consumers;
using FictionArchive.Service.ReportingService.Services;
using FictionArchive.Service.ReportingService.GraphQL;
using FictionArchive.Service.Shared;
using FictionArchive.Service.Shared.Extensions;
namespace FictionArchive.Service.ReportingService;
public class Program
{
public static void Main(string[] args)
{
var builder = WebApplication.CreateBuilder(args);
var isSchemaExport = SchemaExportDetector.IsSchemaExportMode(args);
builder.AddLocalAppsettings();
builder.Services.AddHealthChecks();
#region MassTransit
if (!isSchemaExport)
{
builder.Services.AddFictionArchiveMassTransit(
builder.Configuration,
x =>
{
x.AddConsumer<JobStatusUpdateConsumer>();
});
}
#endregion
#region GraphQL
builder.Services.AddGraphQLServer()
.AddQueryConventions()
.AddTypeExtension(typeof(JobQueries))
.ApplySaneDefaults()
.AddAuthorization();
#endregion
#region Database
builder.Services.RegisterDbContext<ReportingDbContext>(
builder.Configuration.GetConnectionString("DefaultConnection"),
skipInfrastructure: isSchemaExport);
#endregion
// Authentication & Authorization
builder.Services.AddOidcAuthentication(builder.Configuration);
builder.Services.AddFictionArchiveAuthorization();
var app = builder.Build();
// Update database (skip in schema export mode)
if (!isSchemaExport)
{
using var scope = app.Services.CreateScope();
var dbContext = scope.ServiceProvider.GetRequiredService<ReportingDbContext>();
dbContext.UpdateDatabase();
}
app.UseHttpsRedirection();
app.MapHealthChecks("/healthz");
app.UseAuthentication();
app.UseAuthorization();
app.MapGraphQL();
app.RunWithGraphQLCommands(args);
}
}

View File

@@ -0,0 +1,23 @@
{
"$schema": "http://json.schemastore.org/launchsettings.json",
"profiles": {
"http": {
"commandName": "Project",
"dotnetRunMessages": true,
"launchBrowser": true,
"applicationUrl": "http://localhost:5140",
"environmentVariables": {
"ASPNETCORE_ENVIRONMENT": "Development"
}
},
"https": {
"commandName": "Project",
"dotnetRunMessages": true,
"launchBrowser": true,
"applicationUrl": "https://localhost:7310;http://localhost:5140",
"environmentVariables": {
"ASPNETCORE_ENVIRONMENT": "Development"
}
}
}
}

View File

@@ -0,0 +1,32 @@
using FictionArchive.Service.ReportingService.Models;
using FictionArchive.Service.Shared.Services.Database;
using Microsoft.EntityFrameworkCore;
namespace FictionArchive.Service.ReportingService.Services;
public class ReportingDbContext : FictionArchiveDbContext
{
public DbSet<Job> Jobs { get; set; }
public ReportingDbContext(DbContextOptions options, ILogger<ReportingDbContext> logger) : base(options, logger)
{
}
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
base.OnModelCreating(modelBuilder);
modelBuilder.Entity<Job>(entity =>
{
entity.HasIndex(j => j.ParentJobId);
entity.Property(j => j.Metadata)
.HasColumnType("jsonb");
entity.HasOne(j => j.ParentJob)
.WithMany(j => j.ChildJobs)
.HasForeignKey(j => j.ParentJobId)
.OnDelete(DeleteBehavior.SetNull);
});
}
}

View File

@@ -0,0 +1,27 @@
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore": "Warning",
"Microsoft.EntityFrameworkCore": "Warning"
}
},
"ConnectionStrings": {
    "DefaultConnection": "Host=localhost;Database=FictionArchive_Reporting;Username=postgres;Password=postgres"
},
"RabbitMQ": {
"ConnectionString": "amqp://localhost",
"ClientIdentifier": "ReportingService"
},
"OIDC": {
"Authority": "https://auth.orfl.xyz/application/o/fiction-archive/",
"ClientId": "ldi5IpEidq2WW0Ka1lehVskb2SOBjnYRaZCpEyBh",
"Audience": "ldi5IpEidq2WW0Ka1lehVskb2SOBjnYRaZCpEyBh",
"ValidIssuer": "https://auth.orfl.xyz/application/o/fiction-archive/",
"ValidateIssuer": true,
"ValidateAudience": true,
"ValidateLifetime": true,
"ValidateIssuerSigningKey": true
},
"AllowedHosts": "*"
}

View File

@@ -0,0 +1,6 @@
{
"subgraph": "Reporting",
"http": {
"baseAddress": "http://localhost:5140/graphql"
}
}

View File

@@ -0,0 +1,23 @@
using FictionArchive.Common.Enums;
namespace FictionArchive.Service.Shared.Contracts.Events;
public interface IJobStatusUpdate
{
Guid JobId { get; }
Guid? ParentJobId { get; }
string JobType { get; }
string DisplayName { get; }
JobStatus Status { get; }
string? ErrorMessage { get; }
Dictionary<string, string>? Metadata { get; }
}
public record JobStatusUpdate(
Guid JobId,
Guid? ParentJobId,
string JobType,
string DisplayName,
JobStatus Status,
string? ErrorMessage,
Dictionary<string, string>? Metadata) : IJobStatusUpdate;

View File

@@ -1,6 +1,7 @@
using FictionArchive.Service.Shared.Services.Database;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;
using Npgsql;

namespace FictionArchive.Service.Shared.Extensions;

@@ -21,9 +22,14 @@ public static class DatabaseExtensions
        }
        else
        {
            var dataSourceBuilder = new Npgsql.NpgsqlDataSourceBuilder(connectionString);
            dataSourceBuilder.UseNodaTime();
            dataSourceBuilder.UseJsonNet();
            var dataSource = dataSourceBuilder.Build();
            services.AddDbContext<TContext>(options =>
            {
                options.UseNpgsql(dataSource, o =>
                {
                    o.UseNodaTime();
                });

View File

@@ -0,0 +1,20 @@
using FictionArchive.Common.Enums;
using FictionArchive.Service.Shared.Contracts.Events;
using MassTransit;
namespace FictionArchive.Service.Shared.Extensions;
public static class JobStatusPublisher
{
public static Task ReportJobStatus(
this IPublishEndpoint endpoint,
Guid jobId,
string jobType,
string displayName,
JobStatus status,
Guid? parentJobId = null,
string? errorMessage = null,
Dictionary<string, string>? metadata = null)
=> endpoint.Publish<IJobStatusUpdate>(new JobStatusUpdate(
jobId, parentJobId, jobType, displayName, status, errorMessage, metadata));
}
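A producer holding an `IPublishEndpoint` (for example a worker inside another service) could report progress through the helper above. This is a hedged sketch: the caller class is hypothetical, and the `JobStatus` members shown are assumptions, since the enum itself is not part of this diff:

```csharp
// Hypothetical caller publishing job status updates via the helper.
public class NovelImportWorker
{
    private readonly IPublishEndpoint _publishEndpoint;

    public NovelImportWorker(IPublishEndpoint publishEndpoint)
        => _publishEndpoint = publishEndpoint;

    public async Task RunAsync(Guid jobId)
    {
        await _publishEndpoint.ReportJobStatus(
            jobId, "NovelImport", "Import: Example Novel", JobStatus.Running);
        try
        {
            // ... do the actual import work ...
            await _publishEndpoint.ReportJobStatus(
                jobId, "NovelImport", "Import: Example Novel", JobStatus.Completed);
        }
        catch (Exception ex)
        {
            await _publishEndpoint.ReportJobStatus(
                jobId, "NovelImport", "Import: Example Novel", JobStatus.Failed,
                errorMessage: ex.Message);
        }
    }
}
```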

View File

@@ -30,6 +30,7 @@
    <PackageReference Include="NodaTime.Serialization.JsonNet" Version="3.2.0" />
    <PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL" Version="9.0.4" />
    <PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL.NodaTime" Version="9.0.4" />
    <PackageReference Include="Npgsql.Json.NET" Version="9.*" />
    <PackageReference Include="Polly" Version="8.6.5" />
    <PackageReference Include="MassTransit.RabbitMQ" Version="8.*" />
    <PackageReference Include="Microsoft.AspNetCore.Authentication.JwtBearer" Version="8.0.11" />

View File

@@ -1,6 +1,6 @@
 
Microsoft Visual Studio Solution File, Format Version 12.00
#
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FictionArchive.Common", "FictionArchive.Common\FictionArchive.Common.csproj", "{ABF1BA10-9E76-45BE-9947-E20445A68147}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FictionArchive.API", "FictionArchive.API\FictionArchive.API.csproj", "{420CC1A1-9DBC-40EC-B9E3-D4B25D71B9A9}"
@@ -23,6 +23,10 @@ Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FictionArchive.Service.User
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FictionArchive.Service.UserNovelDataService", "FictionArchive.Service.UserNovelDataService\FictionArchive.Service.UserNovelDataService.csproj", "{A278565B-D440-4AB9-B2E2-41BA3B3AD82A}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FictionArchive.Service.ReportingService", "FictionArchive.Service.ReportingService\FictionArchive.Service.ReportingService.csproj", "{F29F7969-2B40-4B92-A8F5-9544A4F700DC}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FictionArchive.Service.ReportingService.Tests", "FictionArchive.Service.ReportingService.Tests\FictionArchive.Service.ReportingService.Tests.csproj", "{E704ACF1-2E1D-4A1C-BBCE-8FAE9F1A9944}"
EndProject
Global
	GlobalSection(SolutionConfigurationPlatforms) = preSolution
		Debug|Any CPU = Debug|Any CPU
@@ -73,5 +77,13 @@ Global
		{A278565B-D440-4AB9-B2E2-41BA3B3AD82A}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{A278565B-D440-4AB9-B2E2-41BA3B3AD82A}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{A278565B-D440-4AB9-B2E2-41BA3B3AD82A}.Release|Any CPU.Build.0 = Release|Any CPU
		{F29F7969-2B40-4B92-A8F5-9544A4F700DC}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
		{F29F7969-2B40-4B92-A8F5-9544A4F700DC}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{F29F7969-2B40-4B92-A8F5-9544A4F700DC}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{F29F7969-2B40-4B92-A8F5-9544A4F700DC}.Release|Any CPU.Build.0 = Release|Any CPU
		{E704ACF1-2E1D-4A1C-BBCE-8FAE9F1A9944}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
		{E704ACF1-2E1D-4A1C-BBCE-8FAE9F1A9944}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{E704ACF1-2E1D-4A1C-BBCE-8FAE9F1A9944}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{E704ACF1-2E1D-4A1C-BBCE-8FAE9F1A9944}.Release|Any CPU.Build.0 = Release|Any CPU
	EndGlobalSection
EndGlobal

View File

@@ -157,6 +157,20 @@ services:
        condition: service_healthy
    restart: unless-stopped

  reporting-service:
    image: git.orfl.xyz/conco/fictionarchive-reporting-service:latest
    networks:
      - fictionarchive
    environment:
      ConnectionStrings__DefaultConnection: Host=postgres;Database=FictionArchive_Reporting;Username=${POSTGRES_USER:-postgres};Password=${POSTGRES_PASSWORD:-postgres}
      RabbitMQ__ConnectionString: amqp://${RABBITMQ_USER:-guest}:${RABBITMQ_PASSWORD:-guest}@rabbitmq
    depends_on:
      postgres:
        condition: service_healthy
      rabbitmq:
        condition: service_healthy
    restart: unless-stopped

  # ===========================================
  # API Gateway
  # ===========================================
@@ -179,6 +193,7 @@ services:
      - file-service
      - user-service
      - usernoveldata-service
      - reporting-service
    restart: unless-stopped
  # ===========================================