We're working with Umbraco 7.6.4, hosted in an Azure Web App (Large), and we're facing an issue when loading a website with more than 250K nodes.
We're importing data from a legacy custom system and, until we reached that node count, everything worked as expected.
We're using Umbraco as backend content editor to add/edit documents, and all the documents are then stored in SQL Server+ElasticSearch and retrieved from another frontend website.
When starting/restarting the Umbraco site, it tries to create/update the umbraco.config file and "sometimes" fails. From the log, we get one of three errors.

An OutOfMemoryException:

ERROR umbraco.content - Error Republishing
System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.

A DB timeout:

Error Republishing
System.Data.SqlClient.SqlException (0x80131904): Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding.

Or a corrupt XML cache:

ERROR Umbraco.Web.Routing.UrlProvider - GetUrl exception.
System.Exception: The Xml cache is corrupt. Use the Health Check data integrity dashboard to fix it.

In all cases, obviously, the site doesn't start.
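Since we're hosted on an Azure Web App, one thing we came across (but haven't fully verified) is the appSetting that is supposed to move the umbraco.config XML cache to the machine's local temp storage instead of the shared file system. A sketch, assuming the Umbraco 7.6 behavior described in the hosting docs:

```xml
<!-- web.config: appSetting said to store umbraco.config in local temp storage
     on Azure (behavior assumed from Umbraco 7.6 documentation, not verified here) -->
<appSettings>
  <add key="umbracoContentXMLUseLocalTemp" value="true" />
</appSettings>
```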
We have already tried excluding all 'heavy' nodes from all indexes, because indexing was very slow. But the problem with umbraco.config remains (the generated file is 1.1 GB), and we still have to import even more data (> 500K nodes).
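For context, the index exclusion we applied looks roughly like this in ExamineIndex.config (the index set name and document type aliases below are placeholders, not our real ones):

```xml
<!-- ExamineIndex.config: excluding heavy document types from an index set.
     "ExternalIndexSet" and the aliases are hypothetical placeholders. -->
<IndexSet SetName="ExternalIndexSet"
          IndexPath="~/App_Data/TEMP/ExamineIndexes/External/">
  <ExcludeNodeTypes>
    <add Name="legacyDocument" />
    <add Name="importedRecord" />
  </ExcludeNodeTypes>
</IndexSet>
```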
Is there a known limit on the maximum number of nodes?
Is there a way to avoid the umbraco.config file creation/update?
Are there any settings to be checked or changed to support all those nodes?
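For reference, the only settings we're aware of that control the XML cache are in the content section of umbracoSettings.config (values shown are the Umbraco 7 defaults as we understand them; the misspelled element name is how it actually appears in the config):

```xml
<!-- umbracoSettings.config (Umbraco 7): settings controlling the umbraco.config
     XML cache. Defaults shown; assumed from the shipped config file. -->
<content>
  <XmlCacheEnabled>True</XmlCacheEnabled>
  <ContinouslyUpdateXmlDiskCache>True</ContinouslyUpdateXmlDiskCache>
  <XmlContentCheckForDiskChanges>False</XmlContentCheckForDiskChanges>
</content>
```

We're not sure whether disabling the cache outright is safe with the published-content routing, which is partly why we're asking.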
Anna M. Serra
Did you find a solution?