  • Harrysoon 31 posts 208 karma points
    May 13, 2019 @ 09:13

    Umbraco 8 Serilog and Serilog sinks

    Has anyone successfully used any Serilog sinks with Umbraco 8?

    I've tried installing the MSSqlServer sink to log to a DB table in Umbraco 8, but it doesn't seem to log anything when using Logger.Info, for example.

    I've followed the Serilog and sink documentation, installed the NuGet package and added the details to serilog.config (and serilog.user.config):

    <add key="serilog:using:MSSqlServer" value="Serilog.Sinks.MSSqlServer" />
    <add key="serilog:write-to:MSSqlServer.connectionString" value="server=<ip address>;database=<database>;user id=<user>;password='<password>'"/>
    <add key="serilog.write-to:MSSqlServer.tableName" value="Logs"/>
    <add key="serilog.write-to:MSSqlServer.autoCreateSqlTable" value="true"/>
    

    The "Logs" table is created using the schema from the sink's documentation. Just wondering if it's possible to use these sinks with Umbraco's Serilog integration?
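    One thing that can help when a sink is configured but never writes (a diagnostic sketch, not from this thread): Serilog swallows sink failures by default, so a bad connection string or schema mismatch fails silently. Enabling Serilog's SelfLog surfaces those internal errors. The file path here is illustrative.

    ```csharp
    // Diagnostic sketch: route Serilog's internal error messages to a file.
    // Call this as early as possible in application startup.
    using System.IO;
    using Serilog.Debugging;

    public static class SerilogDiagnostics
    {
        public static void EnableSelfLog()
        {
            // Path is illustrative - pick somewhere the app pool can write to.
            var writer = File.AppendText(@"App_Data\Logs\serilog-selflog.txt");
            writer.AutoFlush = true;

            // Serilog writes sink-level errors here instead of discarding them.
            SelfLog.Enable(TextWriter.Synchronized(writer));
        }
    }
    ```

    With this in place, exceptions thrown inside a sink (e.g. by Serilog.Sinks.MSSqlServer) show up in the self-log file instead of disappearing.
    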

  • jake williamson 169 posts 738 karma points
    Jan 22, 2020 @ 01:34

    hey there harrysoon,

    did you ever manage to get this working?

    we're trying to set up logging for specific classes to a database but like you, can't get anything to appear in the log...

    cheers,

    jake

  • Harrysoon 31 posts 208 karma points
    Jan 22, 2020 @ 09:55

    Hey,

    Unfortunately I never managed to get this working. I tried various methods, but nothing would log to the DB at all. I tried some other Serilog sinks too, and none seemed to work with Umbraco.

  • jake williamson 169 posts 738 karma points
    Jan 23, 2020 @ 06:45

    darn it! yeah, we've got the same issue - we can write to new files but nothing else seems to work...

    we'll keep plugging away at it and if we get a fix, we'll post it here ;)

  • piiiiiiiiii 8 posts 28 karma points
    Jul 13, 2020 @ 14:04

    Hi, I know it has been a while - but any progress? I only get this error when starting up:

    System.ArgumentNullException: Value cannot be null.
    Parameter name: tableName
    

    Although I have the parameter set up:

    <add key="serilog:using:MSSqlServer" value="Serilog.Sinks.MSSqlServer" />    
    <add key="serilog:write-to:MSSqlServer.connectionString" value="server=[ServerIP];Initial Catalog=[myDatabaseName];User id=[UserId];Password=[Password1]"/>
    <add key="serilog.write-to:MSSqlServer.tableName" value="Logs" />
    <add key="serilog.write-to:MSSqlServer.autoCreateSqlTable" value="true" /> 
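    worth flagging, from outside the thread: the last two keys use `serilog.write-to` (dot) but Serilog's AppSettings reader only picks up keys that start with the `serilog:` prefix, colon-separated - so `tableName` is never read, which would produce exactly this ArgumentNullException. the same mixed separators appear in the config earlier in this thread. a corrected sketch (placeholders left as-is):

    ```xml
    <!-- all keys use the serilog: prefix with colon separators -->
    <add key="serilog:using:MSSqlServer" value="Serilog.Sinks.MSSqlServer" />
    <add key="serilog:write-to:MSSqlServer.connectionString" value="server=[ServerIP];Initial Catalog=[myDatabaseName];User id=[UserId];Password=[Password1]" />
    <add key="serilog:write-to:MSSqlServer.tableName" value="Logs" />
    <add key="serilog:write-to:MSSqlServer.autoCreateSqlTable" value="true" />
    ```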
    
  • jake williamson 169 posts 738 karma points
    Jul 14, 2020 @ 00:54

    hey there,

    i gave up on trying to write to a sql server table in the end... spent hours on it to no avail...

    however i did get writing to azure table storage working - which, thinking about it, is probably faster than sql server (the only drawback being it's only useful if you have azure...)

    the main thing i found was that you need to remove the json logger in serilog.config and replace it with your own logger. if you configure it in serilog.user.config instead, logging will write to the json file AND the azure table...

    confusingly i thought the serilog.user.config file would be a way to configure multiple logs but there's no way to switch the log to view in the backoffice so for my purposes, it's kinda useless...

    so the first thing you'll need is the https://www.nuget.org/packages/Serilog.Sinks.AzureTableStorage/ package

    then edit the serilog.config replacing the json logger with:

    <add key="serilog:using:AzureTableStorage" value="Serilog.Sinks.AzureTableStorage" />
    <add key="serilog:write-to:AzureTableStorage.formatter" value="Serilog.Formatting.Compact.CompactJsonFormatter, Serilog.Formatting.Compact" />
    <add key="serilog:write-to:AzureTableStorage.connectionString" value="xxxFromAzurexxx" />
    <add key="serilog:write-to:AzureTableStorage.storageTableName" value="UmbracoLog" />
    

    couple of things to note:

    the Serilog.Formatting.Compact.CompactJsonFormatter formatter ensures the logging format matches the format the backoffice expects, meaning you'll be able to view the log as normal

    the serilog:write-to:AzureTableStorage.storageTableName value of UmbracoLog is different to the default name that the plugin uses. you can set it to whatever you like

    the last piece of the puzzle is writing a class that displays the log in the backoffice:

    // usings needed by this class (namespaces assumed from the Umbraco 8,
    // WindowsAzure.Storage and Serilog.Formatting.Compact.Reader packages;
    // LogEventEntity comes from the Serilog.Sinks.AzureTableStorage package):
    using System;
    using System.Collections.Generic;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Table;
    using Serilog.Events;
    using Serilog.Formatting.Compact.Reader;
    using Umbraco.Core.Logging.Viewer;

    public class AzureTableLogViewer : LogViewerSourceBase
    {
        public override bool CanHandleLargeLogs => true;
    
        public override bool CheckCanOpenLogs(LogTimePeriod logTimePeriod)
        {
            // this method will not be called because CanHandleLargeLogs is true
            throw new NotImplementedException();
        }
    
        protected override IReadOnlyList<LogEvent> GetLogs(LogTimePeriod logTimePeriod, ILogFilter filter, int skip, int take)
        {
            var cloudStorage = CloudStorageAccount.Parse("xxxFromAzurexxx");
            var tableClient = cloudStorage.CreateCloudTableClient();
            var table = tableClient.GetTableReference("UmbracoLog");
    
            var logs = new List<LogEvent>();
            var count = 0;
    
            var query = new TableQuery<LogEventEntity>().Where(
                TableQuery.CombineFilters(
                    TableQuery.GenerateFilterConditionForDate("Timestamp", QueryComparisons.GreaterThanOrEqual, logTimePeriod.StartTime.Date),
                    TableOperators.And,
                    TableQuery.GenerateFilterConditionForDate("Timestamp", QueryComparisons.LessThanOrEqual, logTimePeriod.EndTime.Date.AddDays(1).AddSeconds(-1))
                    )
                );
    
            var results = table.ExecuteQuery(query);
    
            foreach (var entity in results)
            {
                var logItem = LogEventReader.ReadFromString(entity.Data);
    
                // stop once we've walked past the requested page
                if (count >= skip + take)
                {
                    break;
                }
    
                if (count < skip)
                {
                    count++;
                    continue;
                }
    
                if (filter.TakeLogEvent(logItem))
                {
                    logs.Add(logItem);
                }
    
                count++;
            }
    
            return logs;
        }
    
        public override IReadOnlyList<SavedLogSearch> GetSavedSearches()
        {
            return base.GetSavedSearches();
        }
    
        public override IReadOnlyList<SavedLogSearch> AddSavedSearch(string name, string query)
        {
            return base.AddSavedSearch(name, query);
        }
    
        public override IReadOnlyList<SavedLogSearch> DeleteSavedSearch(string name, string query)
        {
            return base.DeleteSavedSearch(name, query);
        }
    }
    

    this then needs to be wired up in your composer:

    [RuntimeLevel(MinLevel = RuntimeLevel.Run)]
    public class Logging : IUserComposer
    {
        public void Compose(Composition composition)
        {
            composition.SetLogViewer<AzureTableLogViewer>();
        }
    }
    

    it works well and, based on the site i'm using it with (an ecommerce site that writes a lot of json into the logs), it's pretty snappy

    hope that helps!

  • piiiiiiiiii 8 posts 28 karma points
    Jul 14, 2020 @ 08:10

    Hey Jake, Thanks for this!

  • jake williamson 169 posts 738 karma points
    Sep 01, 2020 @ 02:52

    we've hit a bit of a gotcha with the above code...

    the problem is that the AzureTableLogViewer isn't doing efficient paging. if you end up with a lot of logs in the table, the backoffice log viewer becomes unusable... this is due to the query loading ALL the data and the paging then being applied in the foreach (which sucks!).

    as it turns out, paging for azure table storage is a bit more complex than expected. you need to pass TableContinuationToken values between the front end and back end, meaning the current implementation won't work.

    we're at a bit of a crossroads now: go back to trying to get sql server logging working, in which case efficient paging becomes easier, or work out azure table paging and possibly end up hacking the core to support it...

    all a bit painful!
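    for what it's worth, a rough sketch (assumptions: WindowsAzure.Storage client types, names illustrative, not wired into the backoffice contract) of segmented querying with continuation tokens - it bounds how much each request pulls, though it still has to walk segments up to skip + take rather than jumping straight to a page:

    ```csharp
    // Sketch only: stream Azure Table results segment by segment instead of
    // materialising the whole query result in memory.
    using System.Collections.Generic;
    using Microsoft.WindowsAzure.Storage.Table;

    public static class SegmentedTableReader
    {
        // Reads at most 'max' entities, fetching one server-side segment
        // (up to 1000 rows) per round trip.
        public static IEnumerable<T> ReadUpTo<T>(CloudTable table, TableQuery<T> query, int max)
            where T : ITableEntity, new()
        {
            TableContinuationToken token = null;
            var seen = 0;

            do
            {
                var segment = table.ExecuteQuerySegmented(query, token);
                token = segment.ContinuationToken;

                foreach (var entity in segment.Results)
                {
                    if (seen++ >= max) yield break;
                    yield return entity;
                }
            } while (token != null);
        }
    }
    ```

    in GetLogs this would cap the scan at roughly skip + take rows per backoffice page, rather than enumerating the whole table.
    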
