  • Andrew Martin 5 posts 25 karma points
    Jun 03, 2012 @ 22:40
    Andrew Martin
    0

    Programmatically Filtering 500+ nodes??

    Hey all,

    Using Umbraco 4.7.1, I'm working on a blog site project. I didn't use any starter kits and didn't put any time into learning XSLT (I want to work with strongly typed data), so all my macros are in C#.

    Everything works fine, including a search function that uses querystrings to search through all blog posts and output the results to the page. This is done using NodeFactory, and for simplicity's sake all blog posts are under one parent node, so I can do stuff like this:

    // root node that holds all the blog posts
    Node blogFolder = new Node(1132);
    
    var filter = from Node node in blogFolder.ChildrenAsList
                 where node.GetProperty("topic").Value == MyQueryString
                 orderby DateTime.Parse(node.GetProperty("date").Value).Year descending
                 select node;
    
    foreach (Node post in filter)
    {
        MakeHtml(post);
    }
    
    I tested this with 50 spoof posts just to check that the different filters I'd implemented worked, and all was well.
    BUT upon importing around 500 posts from the client's old site, I see the IIS worker process suddenly eating up 2 GB of RAM and throwing an OutOfMemoryException every time I make a request to the page holding the search macro.
    I'm thinking it's something to do with the way I'm using NodeFactory, but I'm really not certain. What are the best practices for doing something like this?

    I'm quite literally stumped on this; any tips or whatever would be appreciated.

  • Rik Helsen 670 posts 873 karma points
    Jun 04, 2012 @ 09:18
    Rik Helsen
    0

    One way to make this fly is to query a search index of your website (Lucene). I think you can also optimize your query a bit, but not being a developer myself I'm quite worthless at giving advice there ;)
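
    For reference, a minimal sketch of that with Examine (the Lucene wrapper that ships with Umbraco 4.7). It assumes the default ExternalSearcher index and that "topic" is an indexed field on the blog post document type; adjust the names to your setup.

    using System.Linq;
    using Examine;
    using umbraco.NodeFactory;
    
    var searcher = ExamineManager.Instance.SearchProviderCollection["ExternalSearcher"];
    
    // build a Lucene query against the indexed "topic" field
    var criteria = searcher.CreateSearchCriteria();
    var query = criteria.Field("topic", MyQueryString).Compile();
    
    // results are lightweight field/value pairs read straight from the index,
    // so no Node objects are created for the hundreds of posts you don't show
    var results = searcher.Search(query);
    
    foreach (var result in results.Take(10))
    {
        MakeHtml(new Node(result.Id));   // hydrate only the posts you render
    }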

     

  • Andrew Martin 5 posts 25 karma points
    Jun 04, 2012 @ 11:32
    Andrew Martin
    0

    I've been digging a little deeper. It's odd: as soon as I pass the threshold of exactly 108 published nodes, any search request causes a massive jump in processor and RAM usage. It makes no sense to me at all. It doesn't make a difference which post is the 109th published; she just dies on me.

    Looking at Lucene.

  • Pedro Adão 43 posts 80 karma points
    Jun 04, 2012 @ 11:39
    Pedro Adão
    0

    Do you need all the posts? Can you do a Take(10)?

    You can check your page number (?page=3) and do a Skip((page - 1) * 10).Take(10)
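
    A minimal sketch of that, reusing the "filter" query from the first post ("pageSize" and the querystring handling here are illustrative, not a fixed convention):

    using System.Linq;
    using System.Web;
    
    const int pageSize = 10;
    
    // read the page number from the querystring, defaulting to page 1
    int page;
    if (!int.TryParse(HttpContext.Current.Request.QueryString["page"], out page) || page < 1)
    {
        page = 1;
    }
    
    // materialise only the ten posts for the requested page
    foreach (Node post in filter.Skip((page - 1) * pageSize).Take(pageSize))
    {
        MakeHtml(post);
    }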

  • Andrew Martin 5 posts 25 karma points
    Jun 04, 2012 @ 11:42
    Andrew Martin
    0

    Yeah, I can and do just that. It's a paginated search, ten results per page. Each node has only a few strings, two true/false values stored as ints, and a link to a local image.

  • Andrew Martin 5 posts 25 karma points
    Jun 04, 2012 @ 13:04
    Andrew Martin
    0

    Never mind guys. I had an infinitely executing loop somewhere in my code that I overlooked =P

    Spent three days trying to figure it out!
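
    For anyone who finds this thread later, a purely hypothetical example of the kind of loop that produces these exact symptoms (not the actual code from this project):

    // the counter is never advanced, so the same post is rendered forever
    // and the response buffer grows until the worker process throws
    // OutOfMemoryException
    int i = 0;
    while (i < posts.Count)
    {
        MakeHtml(posts[i]);
        // missing: i++;
    }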

  • Rik Helsen 670 posts 873 karma points
    Jun 04, 2012 @ 13:08
    Rik Helsen
    0

    What speed are you getting now?

  • Andrew Martin 5 posts 25 karma points
    Jun 06, 2012 @ 13:05
    Andrew Martin
    0

    Now it is lovely and fast, like it should be. 
