Logging in ConnectSoft Microservice Template¶
Purpose & Overview¶
Logging in the ConnectSoft Microservice Template is a core infrastructure component that provides comprehensive visibility into application behavior, performance, errors, and business operations. The template implements structured, contextual logging using Microsoft.Extensions.Logging abstractions with Serilog as the primary provider, ensuring logs are machine-parsable, human-readable, and fully integrated with observability systems.
Logging encompasses:
- Structured Logging: JSON-based, machine-readable log entries with message templates
- Context-Aware Logging: Automatic enrichment with trace IDs, correlation IDs, flow names, and operational metadata
- Multi-Layer Integration: Consistent logging across Domain, Application, Infrastructure, and API layers
- Provider Agnostic: Abstracted via ILogger<T> for framework independence
- Environment Configurable: Log levels, sinks, and formats driven by configuration
- Cross-Cutting Support: Automatic logging in REST, gRPC, background jobs, message consumers, and hosted services
- Security Conscious: Built-in redaction and PII protection
- Observability Ready: Integrated with OpenTelemetry, Application Insights, and distributed tracing
Logging Philosophy
Logging in ConnectSoft is treated as first-class infrastructure. Every log entry is structured, enriched with context, and designed for operational insight, debugging, compliance, and machine analysis. Logs are not optional—they are the living telemetry of the platform.
Architecture Overview¶
The logging architecture follows a layered abstraction model:
Logging System
├── Application Code
│ ├── ILogger<T> Injection (Domain, Application, Infrastructure)
│ ├── Message Templates (Structured Properties)
│ └── Scoped Context (Flow, ObjectId, CorrelationId)
├── Logging Abstraction
│ ├── Microsoft.Extensions.Logging (ILogger, ILoggerFactory)
│ └── Provider Registration (Serilog, Log4Net, OpenTelemetry)
├── Logging Pipeline
│ ├── Serilog Configuration (Sinks, Enrichers, Filters)
│ ├── Request Logging (HTTP/gRPC Middleware)
│ └── Exception Logging (Global Handlers)
└── Output Destinations
├── Console (Development, Docker)
├── File (Rotating Logs)
├── Seq (Local Dashboard)
├── Azure Application Insights (Production)
└── OpenTelemetry Exporters (OTLP, Jaeger, Prometheus)
Key Integration Points¶
| Layer | Component | Responsibility |
|---|---|---|
| Application Code | ILogger<T> | Type-safe logging with message templates |
| Middleware | HTTP/gRPC Interceptors | Automatic request/response logging |
| Services | Domain Processors, Use Cases | Business operation logging |
| Infrastructure | Repository, External APIs | Infrastructure operation logging |
| Background Jobs | Hosted Services, Hangfire | Scheduled task logging |
| Messaging | MassTransit/NServiceBus Consumers | Message processing logging |
Microsoft.Extensions.Logging Foundation¶
ILogger Abstraction¶
All logging in ConnectSoft uses the ILogger<T> interface, ensuring framework independence and testability:
public class DefaultMicroserviceAggregateRootsProcessor : IMicroserviceAggregateRootsProcessor
{
    private readonly ILogger<DefaultMicroserviceAggregateRootsProcessor> logger;

    public DefaultMicroserviceAggregateRootsProcessor(
        ILogger<DefaultMicroserviceAggregateRootsProcessor> logger)
    {
        this.logger = logger;
    }

    public async Task<IMicroserviceAggregateRoot> CreateMicroserviceAggregateRoot(
        CreateMicroserviceAggregateRootInput input,
        CancellationToken token = default)
    {
        this.logger.Here(log => log.LogInformation(
            "Create MicroserviceAggregateRoot for {ObjectId} started...",
            input.ObjectId));

        try
        {
            // ... processing logic ...

            this.logger.Here(log => log.LogInformation(
                "Create MicroserviceAggregateRoot for {ObjectId} successfully completed...",
                input.ObjectId));

            // ... return the created aggregate root ...
        }
        catch (Exception ex)
        {
            this.logger.Here(log => log.LogError(
                ex,
                "Failed to create the MicroserviceAggregateRoot with id {ObjectId}",
                input.ObjectId));
            throw;
        }
    }
}
Log Levels¶
| Level | Use Case | Example |
|---|---|---|
| Trace | Detailed debugging information | Method entry/exit, variable values |
| Debug | Development-time diagnostic information | Domain flow steps, intermediate states |
| Information | General application flow | Use case completion, successful operations |
| Warning | Unexpected but recoverable events | Validation failures, retry attempts |
| Error | Error events requiring attention | Exceptions, failed operations |
| Critical | Critical failures requiring immediate action | System failures, data corruption |
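As a quick illustration, the sketch below (a hypothetical OrderProcessor, not part of the template) applies the levels from the table:

```csharp
using System;
using Microsoft.Extensions.Logging;

public class OrderProcessor
{
    private readonly ILogger<OrderProcessor> logger;

    public OrderProcessor(ILogger<OrderProcessor> logger) => this.logger = logger;

    public void Process(string orderId, int itemCount)
    {
        this.logger.LogTrace("Entering Process for {OrderId}", orderId);                        // Trace: method entry
        this.logger.LogDebug("Order {OrderId} contains {ItemCount} items", orderId, itemCount); // Debug: intermediate state

        if (itemCount == 0)
        {
            this.logger.LogWarning("Order {OrderId} has no items; skipping", orderId);          // Warning: recoverable
            return;
        }

        try
        {
            // ... business logic ...
            this.logger.LogInformation("Order {OrderId} processed", orderId);                   // Information: successful operation
        }
        catch (Exception ex)
        {
            this.logger.LogError(ex, "Processing order {OrderId} failed", orderId);             // Error: failed operation
            throw;
        }
    }
}
```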
Message Templates¶
Always use message templates (not string interpolation) for structured logging:
// ✅ GOOD - Structured property
_logger.LogInformation("Processing order {OrderId} for customer {CustomerId}", orderId, customerId);
// ❌ BAD - String interpolation loses structure
_logger.LogInformation($"Processing order {orderId} for customer {customerId}");
Message templates:

- Enable property extraction for querying
- Prevent accidental object serialization
- Support redaction and masking
- Preserve log structure across sinks
Log Scopes¶
Use scopes to add contextual properties to all logs within a scope:
using (this.logger.BeginScope(
new Dictionary<string, object>
{
["Flow"] = "CreateOrder",
["ObjectId"] = orderId,
["CorrelationId"] = correlationId
}))
{
// All logs within this scope include Flow, ObjectId, CorrelationId
this.logger.LogInformation("Starting order creation");
// ... processing ...
this.logger.LogInformation("Order creation completed");
}
Serilog Configuration¶
Bootstrap Logging¶
Serilog is configured in Program.cs for bootstrap logging (logs before DI container is ready):
#if Serilog
private static readonly Logger Logger = new LoggerConfiguration()
.Enrich.FromLogContext()
.WriteTo.Console()
.WriteTo.Debug()
.CreateLogger();
public static async Task<int> Main(string[] args)
{
try
{
// ... host creation ...
await host.RunAsync().ConfigureAwait(false);
return 0;
}
catch (Exception exception)
{
Logger.Fatal(exception, "ConnectSoft.MicroserviceTemplate.Application failed to start");
throw;
}
finally
{
await Log.CloseAndFlushAsync().ConfigureAwait(false);
}
}
#endif
Host Integration¶
Serilog is integrated with the host builder:
#if Serilog
.UseSerilog((hostingContext, services, configuration) =>
configuration
.WriteTo.Console()
.ReadFrom.Configuration(hostingContext.Configuration)
.ReadFrom.Services(services))
#endif
This enables:
- Configuration-driven setup via appsettings.json
- Dependency injection integration
- Automatic enricher registration
Service Registration¶
Serilog is registered via extension methods:
// SerilogLoggingExtensions.cs
internal static IServiceCollection AddAndConfigureSerilogLogging(
this IServiceCollection services,
IConfiguration configuration,
IWebHostEnvironment env)
{
ArgumentNullException.ThrowIfNull(services);
ArgumentNullException.ThrowIfNull(configuration);
ArgumentNullException.ThrowIfNull(env);
// Enable redaction of classified data
services.AddLogging(loggingBuilder =>
{
loggingBuilder.EnableRedaction();
// Configure logs directory for file sink
IEnumerable<KeyValuePair<string, string>> configuredSerilogSinks =
configuration.GetSection("Serilog:Using").AsEnumerable().ToList();
bool isSerilogFileSinkConfigured = configuredSerilogSinks
.Any(sink => "Serilog.Sinks.File".Equals(sink.Value, StringComparison.Ordinal));
if (isSerilogFileSinkConfigured)
{
string logsHomeDirectoryPath = Environment.GetEnvironmentVariable("LOGS_HOME");
if (string.IsNullOrWhiteSpace(logsHomeDirectoryPath) ||
!Directory.Exists(logsHomeDirectoryPath))
{
var currentWorkingDirectory = new DirectoryInfo(Directory.GetCurrentDirectory());
DirectoryInfo logsHomeDirectory = currentWorkingDirectory.CreateSubdirectory("Logs");
Environment.SetEnvironmentVariable("LOGS_HOME", logsHomeDirectory.FullName);
}
}
if (!env.IsDevelopment())
{
loggingBuilder.ClearProviders();
#if OpenTelemetry
loggingBuilder.AddOpenTelemetry(options =>
{
options.IncludeScopes = true;
options.ParseStateValues = true;
});
#endif
}
Serilog.Debugging.SelfLog.Enable(Console.Error);
// Add Serilog as logging provider
LoggerConfiguration loggerConfiguration = new LoggerConfiguration()
.ReadFrom.Configuration(configuration);
loggingBuilder.AddSerilog(
loggerConfiguration.CreateLogger(),
dispose: true);
});
return services;
}
Request Logging Middleware¶
Serilog request logging captures HTTP request/response information:
// SerilogLoggingExtensions.cs
internal static IApplicationBuilder UseMicroserviceSerilogRequestLogging(
this IApplicationBuilder application)
{
ArgumentNullException.ThrowIfNull(application);
// Write streamlined request completion events
application.UseSerilogRequestLogging(options =>
{
options.EnrichDiagnosticContext = EnrichDiagnosticContext;
options.MessageTemplate = "{Protocol} {RequestMethod} {RequestPath} responded {StatusCode} {ContentType} in {Elapsed:0.0000} ms";
#if HealthCheck
options.GetLevel = GetLevel; // Exclude health checks from verbose logging
#endif
});
return application;
}
private static void EnrichDiagnosticContext(
IDiagnosticContext diagnosticContext,
HttpContext httpContext)
{
var request = httpContext.Request;
var response = httpContext.Response;
// RequestPath, RequestMethod, StatusCode, RequestId, CorrelationId,
// ConnectionId and Elapsed are added by default
diagnosticContext.Set("Host", request.Host);
diagnosticContext.Set("Scheme", request.Scheme);
diagnosticContext.Set("Protocol", request.Protocol);
diagnosticContext.Set("QueryString", request.QueryString);
Endpoint endpoint = httpContext.GetEndpoint();
if (endpoint is not null)
{
diagnosticContext.Set("EndpointName", endpoint.DisplayName);
}
if (httpContext.User != null && httpContext.User.Identity != null)
{
diagnosticContext.Set("UserName", httpContext.User.Identity.Name);
diagnosticContext.Set("IsAuthenticated", httpContext.User.Identity.IsAuthenticated);
}
diagnosticContext.Set("ContentType", response.ContentType);
ResponseHeaders responseHeaders = response.GetTypedHeaders();
if (responseHeaders != null)
{
diagnosticContext.Set("ResponseDate", responseHeaders.Date);
}
}
#if HealthCheck
private static LogEventLevel GetLevel(
HttpContext httpContext,
double elapsedMilliseconds,
Exception exception)
{
if (exception is null && httpContext.Response.StatusCode <= 499)
{
if (IsHealthCheckEndpoint(httpContext))
{
// Health check endpoints are called frequently, mark as verbose
return LogEventLevel.Verbose;
}
return LogEventLevel.Information;
}
return LogEventLevel.Error;
}
private static bool IsHealthCheckEndpoint(HttpContext httpContext)
{
var endpoint = httpContext.GetEndpoint();
if (endpoint is not null)
{
return string.Equals(
endpoint.DisplayName,
"Health checks",
StringComparison.Ordinal);
}
return false;
}
#endif
Log Enrichers¶
Built-in Enrichers¶
Serilog enrichers add contextual metadata to every log entry:
| Enricher | Purpose | Output Field |
|---|---|---|
| FromLogContext() | Captures scoped properties | Flow, ObjectId, CorrelationId, custom properties |
| WithEnvironmentName() | Adds environment | Environment |
| WithMachineName() | Adds machine name | MachineName |
| WithThreadId() | Adds thread identifier | ThreadId |
| WithProperty("Service", ...) | Adds service name | Service |
Configuration¶
Enrichers are configured via appsettings.json:
{
"Serilog": {
"Enrich": [
"FromLogContext",
"WithMachineName",
"WithThreadId",
"WithEnvironmentName"
],
"Properties": {
"Service": "ConnectSoft.MicroserviceTemplate"
}
}
}
Custom Enrichers¶
Use LogContext.PushProperty for custom contextual enrichment:
using (LogContext.PushProperty("Flow", "CreateInvoice"))
using (LogContext.PushProperty("Phase", "Validation"))
using (LogContext.PushProperty("ObjectId", invoiceId))
{
_logger.LogInformation("Validating invoice {InvoiceId}", invoiceId);
// All logs in this scope include Flow, Phase, ObjectId
}
Semantic Logging Attributes¶
ConnectSoft promotes semantic logging with consistent attributes:
| Attribute | Purpose | Example |
|---|---|---|
| Flow | Logical domain or use case pipeline | "CreateOrder", "UserLogin", "InvoiceGeneration" |
| Phase | Step in the lifecycle | "Validation", "BusinessLogic", "Persistence", "Response" |
| CorrelationId | Custom ID across services/messages | Request correlation ID, message correlation ID |
| ObjectId | Domain object identifier | OrderId, UserId, InvoiceId |
| Operation | Specific method or use case name | Method name, operation name |
| StatusCode | Response or operation result | HTTP status, gRPC status, operation status |
| Category | Log event type | "Domain", "Infrastructure", "Security" |
Example Enriched Log¶
{
"timestamp": "2025-01-15T10:33:14.123Z",
"level": "Information",
"message": "Validating invoice {InvoiceId}",
"invoiceId": "INV-8833",
"flow": "InvoiceGeneration",
"phase": "Validation",
"correlationId": "xyz-9911",
"objectId": "INV-8833",
"service": "BillingMicroservice",
"environment": "Production",
"machineName": "kube-node-1",
"threadId": 12,
"traceId": "00-b93fdc...",
"spanId": "f382ab12"
}
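A log like the one above could be produced by a handler that pushes the semantic attributes through a logging scope. The following is a minimal sketch (hypothetical class and method names, not taken from the template):

```csharp
using System;
using System.Collections.Generic;
using Microsoft.Extensions.Logging;

public class InvoiceGenerationHandler(ILogger<InvoiceGenerationHandler> logger)
{
    public void Validate(string invoiceId, string correlationId)
    {
        using (logger.BeginScope(new Dictionary<string, object>(StringComparer.Ordinal)
        {
            ["Flow"] = "InvoiceGeneration",
            ["Phase"] = "Validation",
            ["CorrelationId"] = correlationId,
            ["ObjectId"] = invoiceId,
            ["Category"] = "Domain",
        }))
        {
            // Every entry written inside this scope carries the semantic attributes above.
            logger.LogInformation("Validating invoice {InvoiceId}", invoiceId);
        }
    }
}
```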
Log Sinks¶
Console Sink (Default)¶
The console sink outputs structured JSON logs to standard output:
{
"Serilog": {
"WriteTo": [
{
"Name": "Console",
"Args": {
"formatter": "Serilog.Formatting.Compact.RenderedCompactJsonFormatter, Serilog.Formatting.Compact"
}
}
]
}
}
Console output format:
{"@t":"2025-01-15T15:12:11Z","@l":"Information","@m":"Created order {OrderId}","OrderId":"ORD-9988"}
File Sink (Optional)¶
File sink writes rotating log files:
{
"Serilog": {
"Using": ["Serilog.Sinks.File"],
"WriteTo": [
{
"Name": "File",
"Args": {
"path": "Logs/log-.json",
"rollingInterval": "Day",
"retainedFileCountLimit": 30,
"formatter": "Serilog.Formatting.Compact.CompactJsonFormatter, Serilog.Formatting.Compact"
}
}
]
}
}
File sink configuration:
- rollingInterval: "Day", "Hour", "Infinite"
- retainedFileCountLimit: Maximum number of files to keep
- fileSizeLimitBytes: Maximum file size before rotation
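For reference, a sketch of the equivalent code-first configuration using Serilog's WriteTo.File (the template itself drives these options from appsettings.json):

```csharp
using Serilog;
using Serilog.Formatting.Compact;

Log.Logger = new LoggerConfiguration()
    .WriteTo.File(
        formatter: new CompactJsonFormatter(),
        path: "Logs/log-.json",
        rollingInterval: RollingInterval.Day,    // new file per day
        retainedFileCountLimit: 30,              // keep the most recent 30 files
        fileSizeLimitBytes: 100 * 1024 * 1024,   // also roll when a file reaches ~100 MB
        rollOnFileSizeLimit: true)
    .CreateLogger();
```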
Seq Sink (Optional)¶
Seq sink sends logs to a Seq server for real-time search and analysis:
{
"Serilog": {
"Using": ["Serilog.Sinks.Seq"],
"WriteTo": [
{
"Name": "Seq",
"Args": {
"serverUrl": "http://localhost:5341",
"apiKey": "optional-api-key"
}
}
]
}
}
Seq features:

- Real-time log search
- Structured query language
- Dashboard and alerts
- Log retention policies
Azure Application Insights Sink¶
Application Insights is configured separately from the Serilog pipeline; see the Application Insights Integration section below for the registration extension, configuration, and trace correlation details.
Multi-Sink Configuration¶
Multiple sinks can be configured simultaneously:
{
"Serilog": {
"WriteTo": [
{
"Name": "Console",
"Args": {
"formatter": "Serilog.Formatting.Compact.RenderedCompactJsonFormatter"
}
},
{
"Name": "File",
"Args": {
"path": "Logs/log-.json",
"rollingInterval": "Day"
}
},
{
"Name": "Seq",
"Args": {
"serverUrl": "http://localhost:5341"
}
}
]
}
}
HTTP Request Logging¶
The template supports two complementary HTTP logging mechanisms:
- ASP.NET Core HTTP Logging: Low-level HTTP protocol logging with detailed request/response information
- Serilog Request Logging: High-level structured request completion logging with enriched context
Serilog Request Logging¶
Serilog request logging emits a single structured completion event per HTTP request, enriched with the diagnostic context described above.
Features:

- Structured JSON logging with enriched context
- Automatic correlation with trace IDs
- Performance metrics (duration, status codes)
- Business-level context (Flow, CorrelationId, ObjectId)
See the Request Logging Middleware section above for detailed Serilog request logging documentation.
ASP.NET Core HTTP Logging¶
ASP.NET Core HTTP logging provides low-level protocol logging for debugging and analysis.
See HTTP Logging for comprehensive documentation on:

- ASP.NET Core HTTP logging configuration
- Logging fields and body limits
- Security considerations and sensitive data protection
- Performance optimization
- Integration with observability systems
gRPC Logging¶
gRPC Logging Interceptor¶
gRPC calls are logged via a custom interceptor:
// GrpcServerLoggingInterceptor.cs
public class GrpcServerLoggingInterceptor(ILogger<GrpcServerLoggingInterceptor> logger)
: Interceptor
{
private readonly ILogger<GrpcServerLoggingInterceptor> logger = logger;
public override async Task<TResponse> UnaryServerHandler<TRequest, TResponse>(
TRequest request,
ServerCallContext context,
UnaryServerMethod<TRequest, TResponse> continuation)
{
using (this.logger.BeginScope(
new Dictionary<string, object>(StringComparer.Ordinal)
{
["GrpcMethod"] = context.Method,
}))
{
try
{
this.logger.LogInformation("gRPC call method {GrpcMethod}.", context.Method);
var response = await base.UnaryServerHandler(request, context, continuation)
.ConfigureAwait(false);
this.logger.LogInformation("gRPC call method {GrpcMethod} completed.", context.Method);
return response;
}
catch (Exception exception)
{
this.logger.LogError(
exception,
"Error occurred during gRPC call method {GrpcMethod}.",
context.Method);
throw;
}
}
}
public override async Task ServerStreamingServerHandler<TRequest, TResponse>(
TRequest request,
IServerStreamWriter<TResponse> responseStream,
ServerCallContext context,
ServerStreamingServerMethod<TRequest, TResponse> continuation)
{
using (this.logger.BeginScope(
new Dictionary<string, object>(StringComparer.Ordinal)
{
["GrpcMethod"] = context.Method,
}))
{
try
{
this.logger.LogInformation("gRPC call method {GrpcMethod}.", context.Method);
await base.ServerStreamingServerHandler(request, responseStream, context, continuation)
.ConfigureAwait(false);
this.logger.LogInformation("gRPC call method {GrpcMethod} completed.", context.Method);
}
catch (Exception exception)
{
this.logger.LogError(
exception,
"Error occurred during gRPC call method {GrpcMethod}.",
context.Method);
throw;
}
}
}
}
Registration¶
// GrpcExtensions.cs
services.AddGrpc(options =>
{
options.Interceptors.Add<GrpcServerLoggingInterceptor>();
});
Logged Information¶
- gRPC method name (full path)
- Request processing status
- Exceptions with stack traces
- Execution duration (via Serilog request logging)
Log Configuration¶
appsettings.json Structure¶
Logging is configured via appsettings.json and environment-specific overrides:
{
"Serilog": {
"Using": ["Serilog.Sinks.Console"],
"MinimumLevel": {
"Default": "Information",
"Override": {
"Microsoft": "Warning",
"System": "Warning",
"Grpc": "Warning",
"Microsoft.Hosting.Lifetime": "Information"
}
},
"WriteTo": [
{
"Name": "Console",
"Args": {
"formatter": "Serilog.Formatting.Compact.RenderedCompactJsonFormatter, Serilog.Formatting.Compact"
}
}
],
"Enrich": [
"FromLogContext",
"WithMachineName",
"WithThreadId",
"WithEnvironmentName"
],
"Properties": {
"Service": "ConnectSoft.MicroserviceTemplate"
}
},
"HttpLogging": {
"Enabled": true
}
}
Environment-Specific Configuration¶
| File | Use Case |
|---|---|
| appsettings.json | Base configuration |
| appsettings.Development.json | Local development (verbose, Seq enabled) |
| appsettings.Production.json | Production (minimal, Application Insights) |
| appsettings.Docker.json | Container environments |
| appsettings.Test.json | Integration tests |
Log Level Filtering¶
Control verbosity per namespace:
{
"Serilog": {
"MinimumLevel": {
"Default": "Information",
"Override": {
"Microsoft": "Warning",
"Microsoft.AspNetCore": "Warning",
"System": "Warning",
"Grpc": "Warning",
"ConnectSoft.MicroserviceTemplate.Domain": "Debug"
}
}
}
}
Benefits:

- Reduce framework noise
- Focus on application logs
- Control verbosity per component
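The same per-namespace control can also be expressed in code through Microsoft.Extensions.Logging filters. The sketch below is an alternative to, not part of, the template's Serilog-driven configuration, and assumes services is the application's IServiceCollection:

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;

services.AddLogging(builder =>
{
    builder.SetMinimumLevel(LogLevel.Information);
    builder.AddFilter("Microsoft", LogLevel.Warning);
    builder.AddFilter("Microsoft.AspNetCore", LogLevel.Warning);
    builder.AddFilter("System", LogLevel.Warning);
    builder.AddFilter("Grpc", LogLevel.Warning);
    builder.AddFilter("ConnectSoft.MicroserviceTemplate.Domain", LogLevel.Debug);
});
```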
Dynamic Configuration¶
Serilog settings are read from the host configuration at startup; whether changes are picked up at runtime depends on the underlying configuration providers supporting reload (for example, JSON files registered with reloadOnChange: true):
.UseSerilog((hostingContext, services, configuration) =>
    configuration
        .ReadFrom.Configuration(hostingContext.Configuration)
        .ReadFrom.Services(services))
OpenTelemetry Integration¶
Trace-Linked Logging¶
OpenTelemetry automatically enriches logs with trace context:
// Program.cs
#if OpenTelemetry
logging.AddOpenTelemetry(options =>
{
options.IncludeScopes = true;
options.ParseStateValues = true;
});
#endif
Automatic Enrichment¶
When OpenTelemetry is enabled, logs automatically include:
- traceId: Distributed trace identifier
- spanId: Current span identifier
- parentId: Parent span identifier (when applicable)
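A minimal sketch (hypothetical ActivitySource and class names) of how an entry written inside an active Activity picks up that trace context, assuming the OpenTelemetry logging provider above is enabled:

```csharp
using System.Diagnostics;
using Microsoft.Extensions.Logging;

public class InventoryChecker
{
    // The source must be registered with the OpenTelemetry tracer provider
    // for its activities to be sampled and exported.
    private static readonly ActivitySource Source = new("ConnectSoft.MicroserviceTemplate");

    private readonly ILogger<InventoryChecker> logger;

    public InventoryChecker(ILogger<InventoryChecker> logger) => this.logger = logger;

    public void Check(string orderId)
    {
        // StartActivity returns null when no listener samples this source.
        using Activity? activity = Source.StartActivity("Inventory.Check");

        // The entry is emitted with the ambient traceId/spanId from Activity.Current.
        this.logger.LogInformation("Inventory check complete for Order {OrderId}", orderId);
    }
}
```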
Example Trace-Linked Log¶
{
"timestamp": "2025-01-15T17:22:58Z",
"level": "Information",
"message": "Inventory check complete for Order {OrderId}",
"orderId": "ORD-7788",
"traceId": "00-ab12cd34...",
"spanId": "1a2b3c4d",
"parentId": "00-xyz...",
"flow": "Order.Create",
"service": "OrderService"
}
Benefits¶
- End-to-End Visibility: Every log line linked to a trace span
- Root Cause Isolation: See logs before/after failures in trace context
- Cross-Service Diagnostics: Correlate logs across service boundaries
- Telemetry-to-Log Drilldown: Navigate from traces/metrics to specific log records
Background Jobs and Hosted Services¶
Standard Pattern¶
Background jobs use the same ILogger<T> pattern:
public class MyBackgroundJob
{
private readonly ILogger<MyBackgroundJob> logger;
public MyBackgroundJob(ILogger<MyBackgroundJob> logger)
{
this.logger = logger;
}
public async Task RunAsync(Guid batchId)
{
using (this.logger.BeginScope(
new Dictionary<string, object>
{
["Flow"] = "BatchProcessing",
["ObjectId"] = batchId,
["Phase"] = "Execution"
}))
{
this.logger.LogInformation("Started batch job {BatchId}.", batchId);
try
{
// Process batch
await ProcessBatchAsync(batchId);
this.logger.LogInformation("Batch job {BatchId} completed successfully.", batchId);
}
catch (Exception ex)
{
this.logger.LogError(ex, "Batch job {BatchId} failed.", batchId);
throw;
}
}
}
}
Retry-Aware Logging¶
For retry scenarios, log attempt numbers:
int attempt = 1;
int maxAttempts = 3;
while (attempt <= maxAttempts)
{
try
{
await ProcessAsync();
break;
}
catch (Exception ex) when (attempt < maxAttempts)
{
this.logger.LogWarning(
ex,
"Retrying operation {Attempt}/{MaxAttempts}",
attempt,
maxAttempts);
attempt++;
}
}
Hangfire Integration¶
Hangfire jobs use standard logging:
[AutomaticRetry(Attempts = 3)]
public class ReconcileDailyStatementsJob
{
    private readonly ILogger<ReconcileDailyStatementsJob> logger;

    public ReconcileDailyStatementsJob(ILogger<ReconcileDailyStatementsJob> logger)
    {
        this.logger = logger;
    }

    public async Task ExecuteAsync()
    {
        using (this.logger.BeginScope(
            new Dictionary<string, object>
            {
                ["JobName"] = "ReconcileDailyStatements",
                ["Flow"] = "Finance.Reconciliation"
            }))
        {
            this.logger.LogInformation("Reconciling statements...");
            // ... processing ...
        }
    }
}
Safe Logging & Redaction¶
Principles¶
ConnectSoft logging includes built-in protections against PII and secret leakage:
| Area | Strategy |
|---|---|
| Request/Response Logging | Body logging disabled by default, limited when enabled |
| Structured Logging | Message templates prevent accidental object dumps |
| Custom Redaction | Extension methods for masking sensitive values |
| Use Case Logs | Never log input DTOs directly—extract safe fields |
| Correlation, not Identity | Use traceId, correlationId—not user email, token, IP address |
Redaction Extensions¶
Create extension methods for safe logging:
public static class RedactionExtensions
{
public static string Redact(this string input) => "***";
public static string MaskEmail(this string email) =>
email != null && email.Contains('@')
? Regex.Replace(email, @"(?<=.{2}).(?=[^@]*?@)", "*")
: email;
public static string MaskCreditCard(this string cardNumber) =>
cardNumber != null && cardNumber.Length >= 4
? "****-****-****-" + cardNumber.Substring(cardNumber.Length - 4)
: cardNumber;
}
Safe Logging Pattern¶
// ❌ DON'T: Log raw input
_logger.LogInformation($"Creating user with email {input.Email}");
// ✅ DO: Extract and mask sensitive fields
_logger.LogInformation(
"Creating user with email {MaskedEmail}",
input.Email.MaskEmail());
// ✅ DO: Use correlation IDs instead of identity
using (LogContext.PushProperty("CorrelationId", correlationId))
{
_logger.LogInformation("Processing request");
}
Sensitive Fields to Avoid¶
| Field | Action |
|---|---|
| Authorization, ApiKey | Redact or omit |
| Password, OldPassword, Token | Never log |
| Email, PhoneNumber | Mask unless business-safe |
| SessionId, DeviceId | Correlate via hash |
| CreditCard, SSN, AccountNumber | Always mask |
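For the "correlate via hash" entries, a one-way hash lets related events be grouped without recording the raw identifier. A minimal sketch (the helper is hypothetical, not part of the template):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

public static class CorrelationHashingExtensions
{
    // Produce a stable, non-reversible token for identifiers such as SessionId or DeviceId.
    public static string ToLogToken(this string identifier) =>
        Convert.ToHexString(SHA256.HashData(Encoding.UTF8.GetBytes(identifier)))[..16];
}

// Usage:
// _logger.LogInformation("Session {SessionToken} renewed", sessionId.ToLogToken());
```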
HTTP Logging Security¶
Configure HTTP logging to exclude sensitive headers:
services.AddHttpLogging(logging =>
{
logging.LoggingFields = HttpLoggingFields.RequestProperties |
HttpLoggingFields.ResponseProperties;
logging.RequestBodyLogLimit = 0; // Disable body logging
logging.ResponseBodyLogLimit = 0;
logging.RequestHeaders.Add("X-Custom-Header");
    logging.RequestHeaders.Remove("Authorization"); // Ensure Authorization is not allow-listed; unlisted header values are redacted
});
Application Insights Integration¶
Setup¶
Application Insights integration is configured via extension:
// AzureApplicationInsightsExtensions.cs
public static IServiceCollection AddAzureApplicationInsights(
this IServiceCollection services,
IConfiguration configuration)
{
var connectionString = configuration["ApplicationInsights:ConnectionString"];
if (!string.IsNullOrWhiteSpace(connectionString))
{
services.AddApplicationInsightsTelemetry(options =>
{
options.ConnectionString = connectionString;
options.EnableAdaptiveSampling = false; // Disable for full log flow
});
}
return services;
}
Configuration¶
{
"ApplicationInsights": {
"ConnectionString": "InstrumentationKey=...;IngestionEndpoint=https://..."
}
}
What Gets Sent¶
| Signal Type | Captured |
|---|---|
| Trace logs | Via ILogger<T> and Serilog |
| Unhandled exceptions | Auto-captured |
| Request telemetry | HTTP/gRPC requests |
| Dependency telemetry | External HTTP calls, database operations |
| Custom events | Via TelemetryClient |
Trace Correlation¶
Logs are automatically correlated with traces using traceId → operation_Id mapping. All logs and traces for a request share the same operation_Id in Application Insights.
Kusto Query Example¶
traces
| where message contains "Order created"
| project timestamp, message, severityLevel, customDimensions.Flow, operation_Id
Custom dimensions (Flow, TraceId, ServiceName) are indexed as customDimensions.
Log4Net Integration (Alternative)¶
For environments requiring Log4Net:
// Program.cs
#if Log4Net
logging.AddLog4Net(new Log4NetProviderOptions
{
ExternalConfigurationSetup = true,
});
#endif
// Log4NetLoggingExtensions.cs
internal static void UseLog4Net(this ILoggerFactory loggerFactory)
{
ArgumentNullException.ThrowIfNull(loggerFactory);
loggerFactory.AddLog4Net();
}
Log4Net configuration is external (typically log4net.config).
Testing¶
Mocking ILogger¶
var mockLogger = new Mock<ILogger<MyService>>();
var service = new MyService(mockLogger.Object);
// Verify log calls
mockLogger.Verify(
x => x.Log(
LogLevel.Information,
It.IsAny<EventId>(),
It.Is<It.IsAnyType>((v, t) => v.ToString().Contains("Expected message")),
It.IsAny<Exception>(),
It.Is<Func<It.IsAnyType, Exception, string>>((v, t) => true)),
Times.Once);
Test Logger Sink¶
For integration tests, use a test logger sink:
public class TestLoggerSink : ILogEventSink
{
public List<LogEvent> Events { get; } = new List<LogEvent>();
public void Emit(LogEvent logEvent)
{
Events.Add(logEvent);
}
}
// In test setup
var testSink = new TestLoggerSink();
Log.Logger = new LoggerConfiguration()
.WriteTo.Sink(testSink)
.CreateLogger();
// In test
Assert.IsTrue(testSink.Events.Any(e =>
e.MessageTemplate.Text.Contains("Expected message")));
Best Practices¶
Do's¶
- Use ILogger<T> Everywhere (inject via constructor)
- Use message templates (not string interpolation)
- Include Contextual Properties
- Apply Scoped Context
- Use Appropriate Log Levels:
  - Information: Successful operations, important business events
  - Warning: Recoverable errors, validation failures
  - Error: Exceptions, failed operations
  - Debug: Development-time diagnostics
- Enrich with Semantic Attributes:
  - Always include Flow for use case identification
  - Add ObjectId for entity-specific queries
  - Use CorrelationId for cross-service tracing
- Configure Log Levels by Environment
- Redact Sensitive Data:
  - Never log passwords, tokens, credit cards
  - Mask emails, phone numbers when necessary
  - Use correlation IDs instead of identity
Don'ts¶
- Don't Use String Interpolation
- Don't Log Sensitive Data
- Don't Log Full Exception Details at Info Level
- Don't Log in Loops Without Summarization
- Don't Skip Contextual Enrichment
Troubleshooting¶
Common Issues¶
| Issue | Symptom | Solution |
|---|---|---|
| Logs Not Appearing | No console/file output | Check Serilog configuration, verify sinks are registered |
| Missing Context | Logs lack Flow, TraceId | Ensure FromLogContext() enricher, verify scopes |
| Too Verbose | Excessive framework logs | Configure MinimumLevel.Override for Microsoft, System |
| Missing Trace IDs | Logs lack trace correlation | Verify OpenTelemetry integration, check IncludeScopes |
| Performance Impact | Slow logging | Use appropriate log levels, avoid synchronous file I/O |
| Sensitive Data Leakage | PII in logs | Review message templates, enable redaction, disable body logging |
Debug Logging¶
Enable Serilog self-logging to diagnose issues:
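```csharp
// Already wired up in AddAndConfigureSerilogLogging (see Service Registration above):
Serilog.Debugging.SelfLog.Enable(Console.Error);
```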
This outputs Serilog internal errors to console.
Configuration Validation¶
Verify configuration is loaded correctly:
var serilogSection = configuration.GetSection("Serilog");
if (!serilogSection.Exists())
{
throw new InvalidOperationException("Serilog configuration missing");
}
Summary¶
Logging in the ConnectSoft Microservice Template provides:
- ✅ Structured Logging: JSON-based, machine-readable entries
- ✅ Context-Aware: Automatic enrichment with trace IDs, correlation IDs, flow names
- ✅ Provider Agnostic: Abstracted via ILogger<T> for flexibility
- ✅ Multi-Sink Support: Console, File, Seq, Application Insights, OpenTelemetry
- ✅ Security Conscious: Built-in redaction and PII protection
- ✅ Observability Ready: Integrated with distributed tracing
- ✅ Environment Configurable: Per-environment log levels and sinks
- ✅ Comprehensive Coverage: Automatic logging for HTTP, gRPC, background jobs, messaging
By following these patterns, microservices achieve:
- Visibility — Complete operational insight into application behavior
- Traceability — End-to-end request tracking across services
- Debugging — Rich contextual information for troubleshooting
- Compliance — Audit-ready logs with PII protection
- Performance — Minimal overhead with appropriate log levels
- Scalability — Structured logs suitable for centralized aggregation
The logging infrastructure ensures that ConnectSoft microservices are observable, debuggable, and production-ready across any deployment environment.