  • Matthew 14 posts 84 karma points
    Sep 27, 2018 @ 08:42
    Matthew
    0

    Umbraco 7 - ContentService API saving speed

    Hi,

    I'm using the following to save a content item programmatically:

    ApplicationContext.Current.Services.ContentService.SaveAndPublishWithStatus(contentItem);

    The item saves and the index is updated. The issue is speed: it seems painfully slow and the front end hangs, even though I'm only saving a single item with around 20 or so fields. Having debugged, it's definitely this operation causing the hold-up. I've been pulling my remaining hair out trying to figure out why it runs so slowly.

    I'm sure the operation should be much speedier than what I'm seeing, and equally sure I'm doing something wrong. Is there anything I can try?

    Cheers! Matt

  • Dan Diplo 1554 posts 6205 karma points MVP 6x c-trib
    Sep 27, 2018 @ 12:51
    Dan Diplo
    0

    Unfortunately, updating content via the API can be slow, and there's not a lot you can do about it. A single content update can involve multiple SQL inserts/updates as well as triggering things like Examine reindexing, updating the XML (umbraco.config) cache etc.

    The only slight optimisation you can make is to prevent events being raised by calling the overload, i.e.

    ContentService.SaveAndPublishWithStatus(content, raiseEvents: false);
    
  • Matthew 14 posts 84 karma points
    Sep 28, 2018 @ 01:02
    Matthew
    0

    Hi Dan,

    Thanks for the reply. Am I right in thinking that setting raiseEvents to false would mean no (Examine) indexing? I can certainly give this a try.

    You see, I am reading content from the Examine index, so I need to ensure these items get saved and published.
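
    For context, the reading side looks roughly like this (where "houseListing" stands in for my actual doc type alias):

        // Roughly how the listings are read back out (Umbraco 7 / Examine).
        // "ExternalSearcher" is the default front-end searcher;
        // "houseListing" stands in for the actual document type alias.
        var searcher = ExamineManager.Instance.SearchProviderCollection["ExternalSearcher"];
        var criteria = searcher.CreateSearchCriteria(IndexTypes.Content);
        var query = criteria.NodeTypeAlias("houseListing").Compile();

        foreach (var result in searcher.Search(query))
        {
            var name = result.Fields["nodeName"];
            // ... map the remaining fields onto the listing view model
        }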

    The bottleneck seems to be these SQL calls. Is there any optimisation you'd recommend on the database itself? Table indexes (clustered / non-clustered)? Anything I might not have thought of?

    Are there any alternative ways to save and publish a content item? Packages?

    Generally the site runs fine, but the big let-down is adding items (house listings), which is causing serious performance issues.

  • Dan Diplo 1554 posts 6205 karma points MVP 6x c-trib
    Sep 28, 2018 @ 09:28
    Dan Diplo
    0

    I think Examine indexing still occurs even if you don't raise events - I believe the flag only applies to custom event handlers you might have added.

    Usually database indexes will improve the read performance of select queries but will actually slow down inserts - so there's no benefit to adding more indexes (but don't remove any existing ones, otherwise this will hurt performance elsewhere). See https://use-the-index-luke.com/sql/dml/insert

    The only things I would suggest are:

    1. Add your items on a schedule, preferably at night or when demand on the site is low

    2. Only add items that don't already exist (use a hash of fields to determine this)

    3. You could try adding and saving all the items without publishing them. Once you have added them all, publish the parent node with descendants in one go (rough sketch below).
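
    Something roughly like this for option 3 - an untested sketch, where parentId, listingsToImport and the "houseListing" alias are placeholders for your own parent node, source data and doc type (and the hash check from option 2 is left out):

        // Untested sketch (Umbraco 7 ContentService).
        // parentId, listingsToImport and "houseListing" are placeholders.
        var cs = ApplicationContext.Current.Services.ContentService;

        var items = new List<IContent>();
        foreach (var listing in listingsToImport)
        {
            var item = cs.CreateContent(listing.Name, parentId, "houseListing");
            item.SetValue("listingName", listing.Name);
            item.SetValue("listingAddress", listing.Address);
            items.Add(item);
        }

        // Save everything in one go without publishing (and without raising events)...
        cs.Save(items, raiseEvents: false);

        // ...then publish the parent and all descendants in a single pass.
        var parent = cs.GetById(parentId);
        cs.PublishWithChildrenWithStatus(parent, includeUnpublished: true);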

  • John Bergman 483 posts 1132 karma points
    Sep 28, 2018 @ 01:26
    John Bergman
    0

    Depending on what you are doing, I use a pattern where I wrap the work in a using statement. Each operation that successfully updates the data (assuming there is a change) sets a flag inside the disposable object created by the using construct. When the object is disposed and the flag has been set, that is when I actually perform the save.

    This allows me to batch up changes to the same content item without having to worry about whether the data needs saving again, and I only save when a change was actually made.
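
    A stripped-down sketch of the idea - the class name and property aliases here are just illustrative, not from a package:

        // Illustrative only: wrap updates so a single save happens on dispose,
        // and only if something actually changed.
        public class ContentUpdateScope : IDisposable
        {
            private readonly IContent _content;
            private bool _dirty;

            public ContentUpdateScope(IContent content)
            {
                _content = content;
            }

            // Call for each field; only flags a save if the value differs.
            public void Set(string alias, object value)
            {
                if (!Equals(_content.GetValue(alias), value))
                {
                    _content.SetValue(alias, value);
                    _dirty = true;
                }
            }

            public void Dispose()
            {
                if (_dirty)
                {
                    ApplicationContext.Current.Services.ContentService
                        .SaveAndPublishWithStatus(_content);
                }
            }
        }

    Used something like this:

        using (var scope = new ContentUpdateScope(contentItem))
        {
            scope.Set("listingName", "1 High Street");
            scope.Set("listingAddress", "Somewhere");
        }   // the single save/publish happens here, only if a value changed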

  • Matthew 14 posts 84 karma points
    Sep 28, 2018 @ 01:43
    Matthew
    0

    Hi John, thanks for the reply.

    The items I am trying to save are generally new. Building the content item with the relevant fields (props) - listingName, listingAddress, listingImage etc. - is very quick, but once I hit this method the fun starts, and the bottleneck is that SaveAndPublishWithStatus call.

        /// <summary>
        /// Save the content item to the database, publishing it if it is already published.
        /// </summary>
        public void SaveContent(IContent contentItem)
        {
            if (contentItem.Status == ContentStatus.Published)
            {
                ApplicationContext
                    .Current
                    .Services
                    .ContentService
                    .SaveAndPublishWithStatus(contentItem);
            }
            else
            {
                ApplicationContext
                    .Current
                    .Services
                    .ContentService
                    .Save(contentItem);
            }

            // Tidy up the reference once saved.
            contentItem.DisposeIfDisposable();
            contentItem = null;
            GC.Collect();
        }
    

    I was looking at SQL Profiler on my local machine and can see a whole bunch of Umbraco calls being made. Not sure why so much is going on just to save the item to the database, but still.

    So I'm not doing anything special. Looking at my code (and your ideas), is there anything I could / should be doing?

  • Dan Diplo 1554 posts 6205 karma points MVP 6x c-trib
    Sep 28, 2018 @ 09:23
    Dan Diplo
    0

    I'd remove calls to GC.Collect() - you should never have any reason to call this explicitly.

    See https://blogs.msdn.microsoft.com/ricom/2003/12/02/two-things-to-avoid-for-better-memory-usage/

  • Matthew 14 posts 84 karma points
    Sep 28, 2018 @ 14:16
    Matthew
    0

    Thanks Dan. Yes, I have removed that line. Not sure why it was in there, but still - good article, it explained things well.

    I was clutching at straws a little by mentioning the indexes. I can see Umbraco has already put the relevant indexes in place, and these are for reading in any case.

    The items are being added one at a time by an end user, so they need to be added and published at that moment. I could save then publish later, but for now I need to save and publish in one go.

    The item will contain around 20-30 properties. Could any of these be slowing the operation down? They are generally strings - nothing fancy, and no large blobs of data.

  • Steve Morgan 1348 posts 4457 karma points c-trib
    Sep 28, 2018 @ 07:09
    Steve Morgan
    0

    Hi Matthew,

    How many houses are listed at one time - how many content nodes do you have?

    If there are thousands rather than tens, it sounds like you're storing something that might be better handled by custom tables rather than as content nodes in Umbraco. There's an overhead to having a content node (publishing, publish history etc.).

    For sites with lots of products / house listings / whatever, this is usually the way I'd go. You can use route hijacking (custom controllers) to pull in your custom content and render each house listing page as a virtual node (remember to add these to your sitemap too) - there's a rough sketch below.

    I go this way if there is a lot of change in content (CRUD) and / or a lot of individual items (e-commerce products in the thousands, etc.).

    Obviously you might then have issues with DB overhead so you'd want caching on searches etc.
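
    To give a rough idea of the route hijacking side (HouseListingRepository is a made-up stand-in for whatever reads your custom table, not a real package):

        // Rough sketch of route hijacking in Umbraco 7. A controller named after
        // the doc type alias ("houseListing") takes over rendering for that type.
        public class HouseListingController : RenderMvcController
        {
            public override ActionResult Index(RenderModel model)
            {
                // Look the listing up in the custom table (e.g. by an id on the query string)
                var listing = new HouseListingRepository().Get(Request.QueryString["id"]);

                // Hand both the Umbraco page and the custom data to the view
                ViewBag.Listing = listing;
                return CurrentTemplate(model);
            }
        }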

    Steve
