Examine date fields in wrong culture
I just upgraded a site to 4.5.2 and overwrote the language file where I had previously changed en-GB to en-US. In the half day it took me to notice, there were about 12 new nodes published. Changing the culture back to en-US fixed most issues, but these 12 nodes have a date field that is now stored in the wrong format in the Examine index. For example, 8/31/2010 is stored as 20100831000000000 instead of 2010-08-31 00:00:00Z. This is causing an error every time one of these nodes shows up in the search results.
I've tried everything I can think of to get the index updated with the correct date format, but no luck. Even unpublishing the node, recycling the application pool, then republishing doesn't work. Anyone with a better understanding of Examine/Lucene have an idea?
Thanks,
Jeff
This is just how Lucene stores dates. We're in the process of improving the date storage, though, and it will change in the next RC of Examine.
The date format is yyyyMMddhhmmsshh, if you're interested, and you can use that to convert it back to a .NET date.
That isn't correct, since HH and hh conflict.
I suggest the following instead:
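A minimal sketch of the corrected pattern, assuming the stored value uses the 17-character millisecond resolution from the example above (20100831000000000):

```
// 24-hour "HH" instead of the duplicated "hh", and "fff" for the trailing
// milliseconds, so the pattern lines up with 17-character values like
// 20100831000000000.
const string luceneDateFormat = "yyyyMMddHHmmssfff";
```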
I see, so nothing to do with the culture. I must have been using an older version of Examine (maybe 0.9.2.0) that got upgraded when I copied over the 4.5.2 bin folder. Will the new version in the next RC use a standard date format that .NET can parse as-is?
Thanks for your help.
You can try the latest source in Umbraco, which has a revamped way of handling dates in Examine and pushing them into Lucene.
Previously we were only storing text fields, using the Lucene.Net DateTools API to transform them before pushing them in. Now we're leaving it at the config level to define whether a field should be handled as a date or left as a string, but it's currently not documented, so you'll have to look into the code if you want to use it. There's still the question of how to push that back into the search results, though, since Lucene doesn't store information about whether fields were added as numbers (i.e. dates) or strings.
Otherwise you can use DateTime.ParseExact to convert the stored string into a DateTime object: http://msdn.microsoft.com/en-us/library/w2sa9yss.aspx
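For example, a quick round-trip with that pattern (a sketch, using the stored value from the post above; the names are just illustrative):

```
using System;
using System.Globalization;

class LuceneDateExample
{
    static void Main()
    {
        // The value as it sits in the Examine/Lucene index (DateTools-style string).
        var stored = "20100831000000000";

        // Parse with the corrected pattern: 24-hour "HH" and "fff" milliseconds.
        var date = DateTime.ParseExact(stored, "yyyyMMddHHmmssfff",
                                       CultureInfo.InvariantCulture);

        Console.WriteLine(date.ToString("u")); // 2010-08-31 00:00:00Z
    }
}
```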
I've made a piece of helper code for this format:
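Roughly along these lines; a sketch that pulls each component out with substrings (the class and method names here are just illustrative):

```
using System;

public static class ExamineDateHelper
{
    // Parses the Lucene/Examine date string (yyyyMMddHHmmssfff),
    // e.g. "20100831000000000", by slicing out each component.
    public static DateTime ParseLuceneDate(string value)
    {
        var year        = int.Parse(value.Substring(0, 4));
        var month       = int.Parse(value.Substring(4, 2));
        var day         = int.Parse(value.Substring(6, 2));
        var hour        = int.Parse(value.Substring(8, 2));
        var minute      = int.Parse(value.Substring(10, 2));
        var second      = int.Parse(value.Substring(12, 2));
        var millisecond = int.Parse(value.Substring(14, 3));

        return new DateTime(year, month, day, hour, minute, second, millisecond);
    }
}
```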
And some unit tests:
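Something like this, as a sketch using NUnit (assuming the helper sketched above):

```
using System;
using NUnit.Framework;

[TestFixture]
public class ExamineDateHelperTests
{
    [Test]
    public void Parses_Date_At_Midnight()
    {
        var result = ExamineDateHelper.ParseLuceneDate("20100831000000000");
        Assert.AreEqual(new DateTime(2010, 8, 31), result);
    }

    [Test]
    public void Parses_Time_And_Milliseconds()
    {
        var result = ExamineDateHelper.ParseLuceneDate("20100831143015123");
        Assert.AreEqual(new DateTime(2010, 8, 31, 14, 30, 15, 123), result);
    }
}
```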
I wound up with something similar, but took Slace's advice and went with ParseExact; it's a little cleaner than the substrings.