Has anyone successfully used any Serilog sinks with Umbraco 8?
I've tried installing the MSSqlServer sink to log to a database table in Umbraco 8, but nothing seems to be logged when using Logger.Info, for example.
I've followed the Serilog and sink documentation, installed the NuGet package, and added the details to serilog.config (and serilog.user.config).
The "Logs" table is created using the schema from the sink's documentation. Is it possible to use these sinks with Umbraco's Serilog integration?
Unfortunately I never managed to get this working. I tried various methods, but nothing would log to the DB at all. I tried some other Serilog sinks too, and none seemed to work with Umbraco.
i gave up on trying to write to a sql server table in the end... spent hours on it to no avail...
however, i did get writing to azure table storage working - which, thinking about it, is probably faster than sql server (the only drawback being it's only useful if you're on azure...)
the main thing i found is that you need to remove the json logger in serilog.config and replace it with your own logger. if you configure it in serilog.user.config instead, logging will write to the json file AND the azure table...
confusingly, i thought the serilog.user.config file would be a way to configure multiple logs, but there's no way to switch which log you view in the backoffice, so for my purposes it's kinda useless...
using Serilog.Formatting.Compact.CompactJsonFormatter, Serilog.Formatting.Compact as the formatter ensures the logging format matches what the backoffice expects, meaning you'll be able to view the log as normal
the serilog:write-to:AzureTableStorage.storageTableName value of UmbracoLog is different from the default name the sink uses; you can set it to whatever you like
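for reference, a minimal sketch of what the serilog.config entries might look like - the connection string is a placeholder and the exact key names follow the app-settings convention the sink documents, so double-check against the sink's README:

```xml
<configuration>
  <appSettings>
    <!-- load the azure table storage sink instead of the default json file logger -->
    <add key="serilog:using:AzureTableStorage" value="Serilog.Sinks.AzureTableStorage" />
    <add key="serilog:write-to:AzureTableStorage.connectionString" value="DefaultEndpointsProtocol=https;AccountName=xxx;AccountKey=xxx" />
    <!-- table name referenced by the log viewer below; the sink's default differs -->
    <add key="serilog:write-to:AzureTableStorage.storageTableName" value="UmbracoLog" />
    <!-- compact json format so the backoffice log viewer can parse the events -->
    <add key="serilog:write-to:AzureTableStorage.formatter" value="Serilog.Formatting.Compact.CompactJsonFormatter, Serilog.Formatting.Compact" />
  </appSettings>
</configuration>
```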
the last piece of the puzzle is writing a class that displays the log in the backoffice:
using System;
using System.Collections.Generic;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;
using Serilog.Events;
using Serilog.Formatting.Compact.Reader;
using Umbraco.Core.Logging.Viewer;

// assumed shape of the rows the sink writes: the compact-json payload lives in a Data column
public class LogEventEntity : TableEntity
{
    public string Data { get; set; }
}

public class AzureTableLogViewer : LogViewerSourceBase
{
    public override bool CanHandleLargeLogs => true;

    public override bool CheckCanOpenLogs(LogTimePeriod logTimePeriod)
    {
        // this method will not be called because CanHandleLargeLogs is true
        throw new NotImplementedException();
    }

    protected override IReadOnlyList<LogEvent> GetLogs(LogTimePeriod logTimePeriod, ILogFilter filter, int skip, int take)
    {
        var cloudStorage = CloudStorageAccount.Parse("xxxFromAzurexxx");
        var tableClient = cloudStorage.CreateCloudTableClient();
        var table = tableClient.GetTableReference("UmbracoLog");
        var logs = new List<LogEvent>();
        var count = 0;

        // fetch every row for the period; paging is applied in memory below
        var query = new TableQuery<LogEventEntity>().Where(
            TableQuery.CombineFilters(
                TableQuery.GenerateFilterConditionForDate("Timestamp", QueryComparisons.GreaterThanOrEqual, logTimePeriod.StartTime.Date),
                TableOperators.And,
                TableQuery.GenerateFilterConditionForDate("Timestamp", QueryComparisons.LessThanOrEqual, logTimePeriod.EndTime.Date.AddDays(1).AddSeconds(-1))));

        var results = table.ExecuteQuery(query);
        foreach (var entity in results)
        {
            // stop once we've walked past the requested page
            // (>= rather than > avoids returning one item too many)
            if (count >= skip + take)
            {
                break;
            }
            if (count < skip)
            {
                count++;
                continue;
            }
            var logItem = LogEventReader.ReadFromString(entity.Data);
            if (filter.TakeLogEvent(logItem))
            {
                logs.Add(logItem);
            }
            count++;
        }
        return logs;
    }

    public override IReadOnlyList<SavedLogSearch> GetSavedSearches() => base.GetSavedSearches();

    public override IReadOnlyList<SavedLogSearch> AddSavedSearch(string name, string query) => base.AddSavedSearch(name, query);

    public override IReadOnlyList<SavedLogSearch> DeleteSavedSearch(string name, string query) => base.DeleteSavedSearch(name, query);
}
this then needs to be wired up in your composer:
[RuntimeLevel(MinLevel = RuntimeLevel.Run)]
public class Logging : IUserComposer
{
    public void Compose(Composition composition)
    {
        composition.SetLogViewer<AzureTableLogViewer>();
    }
}
it works well, and based on the site i'm using it with (an ecommerce site that writes a lot of json into the logs) it's pretty snappy
we've hit a bit of a gotcha with the above code...
the problem is that the AzureTableLogViewer isn't doing efficient paging. if you end up with a lot of logs in the table, the backoffice log viewer becomes unusable... this is because the query loads ALL the data and the paging is then applied in the foreach (which sucks!).
as it turns out, paging for azure table storage is a bit more complex than expected. you need to pass continuation token values between the front end and back end, meaning the current implementation won't work.
we're at a bit of a crossroads now: either go back to trying to get sql server logging working, in which case efficient paging becomes easier, or work out azure table paging and possibly end up hacking the core to support it...
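for what it's worth, a rough sketch of the server-side half of segmented paging - ExecuteQuerySegmented and TableContinuationToken are the real Azure Storage client APIs, but GetLogPage is just an illustrative helper (not part of Umbraco's API), and carrying the token across backoffice requests remains the unsolved part:

```csharp
// illustrative helper, not an Umbraco API: fetches one segment of log rows at a time.
// the caller must hold on to the returned TableContinuationToken and pass it back
// on the next request to get the next page (null token means start from the beginning).
public static (List<LogEventEntity> Page, TableContinuationToken Next) GetLogPage(
    CloudTable table, TableQuery<LogEventEntity> query, TableContinuationToken token, int pageSize)
{
    // TakeCount limits how many rows this segment request returns
    query.TakeCount = pageSize;
    var segment = table.ExecuteQuerySegmented(query, token);
    // segment.ContinuationToken is null when there are no more rows
    return (new List<LogEventEntity>(segment.Results), segment.ContinuationToken);
}
```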
Umbraco 8 Serilog and Serilog sinks
hey there harrison,
did you ever manage to get this working?
we're trying to set up logging for specific classes to a database but like you, can't get anything to appear in the log...
cheers,
jake
darn it! yeah, we've the same issue - we can write to new files but nothing else seems to work...
we'll keep plugging away at it and if we get a fix, we'll post it here ;)
Hi, I know it has been a while - but any progress? I only get an error when starting up:
Although I have the parameter set up:
hey there,
so the first thing you'll need is the https://www.nuget.org/packages/Serilog.Sinks.AzureTableStorage/ package
hope that helps!
Hey Jake, Thanks for this!
all a bit painful!
Had the same issue, and the following connection string format solved it: