Search Log Viewer by Date/Time
What is the proper syntax to search the Log Viewer by event date/time?
Umbraco v8
Hi Brad
This is tricky
The Trace Logs are written into individual log files on disk, one for each calendar day.
When you use the backoffice start date and end date functionality to search these logs... all this does is work out which files to open and load into memory to be filtered...
... so there is no granular control over searching for a time period.
If you look at the logs you'll see there is a Timestamp property for each log entry, but this isn't a 'normal' property - if you look at the log files on disk you'll see it is actually represented as @t within each entry...
I found that in a V8 site I wasn't able to successfully use @t, Timestamp or @Timestamp in a filter query via the backoffice.
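For example, a single entry in one of the JSON log files on disk looks roughly like this (values made up purely for illustration - @t is the timestamp, @mt the message template and @l the level in the compact JSON format Serilog writes):

    {"@t":"2024-05-01T09:15:30.1234567Z","@mt":"Request to {Url} failed","@l":"Warning","Url":"/contact"}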
But in a V10 site, a filter of
Has(@t)
and
Has (@Timestamp)
did return records
Every time I tried to use Like, StartsWith or > in a query with those fields, though, I just got no records...
I haven't seen an example anywhere online of filtering on the timestamp!
It's meant to be using the Serilog-expressions library: https://github.com/serilog/serilog-expressions
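For comparison, expressions against ordinary event properties do seem to work fine - queries along the lines of the default saved searches, for instance (property names here are just illustrative):

    Has(@Exception)
    Has(Duration) and Duration > 1000
    StartsWith(SourceContext, 'Umbraco.Core')

... it only seems to be the timestamp that won't play ball.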
There is a standalone Log Viewer:
https://github.com/warrenbuckley/Compact-Log-Format-Viewer
I'm not sure if that would provide any joy...
What we've ended up doing on sites (because usually the log files are over 1 GB and can't be viewed via the Umbraco backoffice) is to use a package called serilog-sinks-applicationinsights - https://github.com/serilog-contrib/serilog-sinks-applicationinsights - to sync the Umbraco logs into Microsoft Application Insights... which has a much richer syntax for querying data between specific moments in time.
But I'd be interested if you do get to the bottom of it!
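For example, once the logs are flowing into Application Insights you can narrow down to an exact window with a Kusto query along these lines (just a sketch - traces, timestamp and severityLevel are what the default trace telemetry gives you):

    traces
    | where timestamp between (datetime(2024-05-01 09:00) .. datetime(2024-05-01 09:30))
    | where severityLevel >= 3  // 3 = Error, 4 = Critical
    | order by timestamp asc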
regards
Marc
Marc,
Thanks for the reply.
I've come to the same conclusion. I had tried every possible combination of search terms I could think of for filtering down to a specific TIME, and nothing worked.
When the log files have 5,000+ entries, you can't reasonably "manually" page through 30 pages of entries trying to get to the specific time an event happened. I also found no way to manually enter a specific page to move to, like a URL parameter, which would have helped.
Unfortunately this really limits the usability of the Log Viewer if you want to visually review ALL entries for a specific time frame.
This app is on Azure, so perhaps we will have to look at the Insights sink. We had considered Insights before, but balked as there was no way to estimate cost unless you know exactly how much data you will be sending. Was it complicated to get working? How are costs?
Hi Brad
Agree it's fairly unusable for large sites and Application Insights is amazing.
In terms of costs, it's based on the volume of logs ingested, but you can set caps if you want to prevent it going out of control!
The largest site we have using it uses about 350 MB per day (we have a 3 GB daily cap just in case); it works out at around £25 per month on pay-as-you-go.
But we also use it to provide stats on some of the functionality of the site: by logging when quotes and reservations are completed, we can query Application Insights to show near-real-time daily stats on a dashboard in Umbraco with minimal implementation.
And you can also set up alerts when specific errors happen, or use AI to predict when an unhealthy number of errors are occurring in a time period, etc., to email us and alert us to issues.
So it's a bit more than just log storage!
If you do need it purely to work through errors from the logs it is also very good: we have three different applications and Azure Functions all sharing the same Application Insights resource, which means we can cross-reference a user journey across instances, and that has really helped get to the bottom of things quickly.
Install is pretty straightforward; there is info here: https://github.com/serilog-contrib/serilog-sinks-applicationinsights
But essentially it's: install the package, then add appsettings to tell Serilog to use Serilog.Sinks.ApplicationInsights
and to set TelemetryConverter to be "Serilog.Sinks.ApplicationInsights.TelemetryConverters.TraceTelemetryConverter, Serilog.Sinks.ApplicationInsights"
etc
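To give a rough idea, the appsettings bit ends up something like this (a sketch based on the sink's README - swap in your own Application Insights connection string):

    "Serilog": {
      "Using": [ "Serilog.Sinks.ApplicationInsights" ],
      "WriteTo": [
        {
          "Name": "ApplicationInsights",
          "Args": {
            "connectionString": "<your Application Insights connection string>",
            "telemetryConverter": "Serilog.Sinks.ApplicationInsights.TelemetryConverters.TraceTelemetryConverter, Serilog.Sinks.ApplicationInsights"
          }
        }
      ]
    }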
regards
Marc