Insert/Update nodes directly in Database
I am working on a webservice integration with Umbraco. Around 5,000 products come from an external webservice as an XML structure, and these need to be compared with the corresponding nodes in Umbraco, creating new nodes (if they do not already exist) or updating the existing ones.
This needs to run every 10-20 minutes, as the external product database changes very frequently. I have written a Razor script that does a LINQ join on the webservice XML and the node data and performs all the necessary actions, but it is extremely slow (a run takes more than an hour), which is not an acceptable solution.
Does anyone have a suggestion for doing this operation in a better/more efficient way?
I was wondering whether it is possible to create/update the nodes directly in the database with INSERT/UPDATE queries, rather than through the Umbraco APIs in Razor, so we could write the necessary stored procedures and call them from a scheduled task on the server.
Any help on this will be greatly appreciated.
Thanks in advance.
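One thing that usually makes this kind of sync slow is writing every product on every run, whether or not it changed. A cheap improvement is to fingerprint each incoming product and only call the (expensive) Umbraco save/publish API for products whose payload actually differs. Here is a minimal, language-agnostic sketch in Python (the real implementation would be C# in the scheduled task; the field names and the `stored_hashes` lookup are hypothetical, e.g. a hash saved as an extra property on each node):

```python
import hashlib
import json

def product_hash(fields):
    """Stable fingerprint of a product's payload.

    json.dumps with sort_keys=True makes the serialization deterministic,
    so equal payloads always hash to the same value.
    """
    payload = json.dumps(fields, sort_keys=True).encode("utf-8")
    return hashlib.sha1(payload).hexdigest()

def changed_skus(feed, stored_hashes):
    """Return the SKUs whose incoming payload differs from the stored fingerprint.

    feed:          {sku: fields-dict} parsed from the webservice XML
    stored_hashes: {sku: fingerprint} recorded when each node was last saved
    """
    return {sku for sku, fields in feed.items()
            if stored_hashes.get(sku) != product_hash(fields)}
```

With this in place, a run where only a handful of the 5,000 products changed only touches that handful of nodes; everything else is skipped after a dictionary lookup and a hash comparison.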
The Umbraco API can be pretty slow, and doing this directly in the DB won't be easy. The best solution is probably to store all the products in a custom table, managed through a custom section, instead of using nodes. If you need to run the import every 10-20 minutes, that's probably the only way.
Jeroen
Thank you for your comment.
The problem with custom tables and sections is that we need the product data to be used in Tea Commerce.
As far as I know, Tea Commerce needs the products to be created as Umbraco nodes. I will check with the Tea Commerce team to see whether a custom table can be used with the Tea Commerce APIs instead of Umbraco nodes.
Thanks again :)
Hi Anz,
Correct me if I'm wrong here, but the import of the 5,000+ products will then be a one-time task for actually importing the products and creating them as nodes, yes? If that's the case, I would do a one-time import and then use either the XML cache (NodeFactory) or Examine (preferred, performance-wise) to look up the documents/nodes that need to be updated.
Just make sure that your Umbraco backend is well structured, as opening a folder with 5,000+ documents in it can be slow ;-)
So basically the scenario will be:
- Do a one-time, long running import
- Create a service that performs CRUD operations based on either the XML cache or Examine.
- Run the service as a scheduled task every N minutes.
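The recurring service in the steps above essentially classifies each incoming product against the nodes that already exist. A minimal sketch of that classification (Python here for brevity; in the actual Umbraco service this would be C#, with `existing_skus` coming from an Examine query rather than the hypothetical sets shown):

```python
def classify(feed_skus, existing_skus):
    """Split a sync run into the three CRUD buckets.

    feed_skus:     set of product keys present in the incoming XML feed
    existing_skus: set of product keys already present as Umbraco nodes
    """
    to_create = feed_skus - existing_skus   # in the feed, but no node yet
    to_update = feed_skus & existing_skus   # present on both sides
    to_delete = existing_skus - feed_skus   # node exists, gone from the feed
    return to_create, to_update, to_delete
```

Because both sides are plain key sets, the classification itself is fast even at 5,000+ products; the expensive part stays confined to the actual create/update/delete calls.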
With 5,000+ nodes I *think* the XML cache would be a bit slow (not entirely sure here, as I've never worked with that number of documents), but at least it doesn't cause a database hit for every node. I think Examine/Lucene would be a good fit for searching through the documents to determine which CRUD operation needs to be performed.
Let me know if any of the above doesn't make sense ;-) It's late here, after all.
All the best,
Bo
By the way, I would not recommend trying to perform CRUD operations directly against the database. That is just prone to errors, since everything is cached, versioned, and so on.