GetXmlDocumentByUrl and proxy servers
I messed up! I made an offhand promise based on the fact that twitter streams are available as XML feeds and knowing how easy it is to consume RSS in umbraco. So now when I try to display our superintendent's twitter stream on our home page, I get an "Error parsing XSLT" message. Digging a bit deeper, I realize that whenever I attempt to retrieve any XML file using GetXmlDocumentByUrl(), the script blows up if the document is located outside our local network. Then it hits me: we are behind an authenticating proxy server, and there is no way around this for URLs external to our network!
My question for the umbraco gurus is this: is there a way to modify the umbraco code that retrieves the XML so that I can pass along credentials to the proxy server?
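One avenue worth trying before modifying umbraco's code: GetXmlDocumentByUrl() presumably goes through the standard .NET web request stack, which honors the site's proxy configuration. If your proxy accepts the Windows credentials of the application pool identity, a sketch like the following in web.config may be enough (the proxy address below is a placeholder, not your actual server):

```xml
<!-- Sketch: route outbound HTTP requests through an authenticating proxy.  -->
<!-- "proxy.example.local:8080" is a placeholder - substitute your proxy.   -->
<!-- useDefaultCredentials="true" sends the app pool identity's Windows     -->
<!-- credentials; if the proxy needs an explicit username/password instead, -->
<!-- you'd have to supply a custom IWebProxy implementation in code.        -->
<configuration>
  <system.net>
    <defaultProxy useDefaultCredentials="true">
      <proxy proxyaddress="http://proxy.example.local:8080"
             bypassonlocal="true" />
    </defaultProxy>
  </system.net>
</configuration>
```

The `bypassonlocal="true"` attribute keeps requests to machines inside your network from being routed through the proxy, which matches the behavior you're already seeing for internal URLs.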
If you're using getDocument, an error may be thrown if (when) twitter goes down for a bit. Maybe best to use JavaScript?
http://www.webresourcesdepot.com/3-jquery-twitter-plugins-adding-tweets-to-your-website/
And if http://our.umbraco.org/projects/twitter-for-umbraco is like any of the Author's other projects, it should work fairly seamlessly.
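For reference, the client-side approach those plugins take boils down to fetching the feed as JSON and injecting markup into the page. A minimal sketch of the rendering half, assuming a `{ text, created_at }` shape for each tweet (that shape, and the feed URL in the comment, are illustrative assumptions, not Twitter's actual API):

```javascript
// Render an array of tweet objects into an HTML list.
// The { text, created_at } object shape is assumed for illustration.
function renderTweets(tweets) {
  var items = tweets.map(function (t) {
    return "<li>" + escapeHtml(t.text) +
           " <small>" + escapeHtml(t.created_at) + "</small></li>";
  });
  return '<ul class="tweets">' + items.join("") + "</ul>";
}

// Basic HTML escaping so tweet text can't inject markup into the page.
function escapeHtml(s) {
  return String(s)
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
}

// In the page you would fetch the feed and inject the result, e.g.:
// fetch(feedUrl)
//   .then(function (r) { return r.json(); })
//   .then(function (tweets) {
//     document.getElementById("tweets").innerHTML = renderTweets(tweets);
//   });
```

Because the fetch happens in the browser, it never touches the server-side proxy at all, which is what makes this a workaround for the authenticating-proxy problem.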
Thanks for the reply! Actually, I'm sure it isn't twitter, because I tested several XML feeds and all of them failed when they were outside our network. I thought about JavaScript, but I hate to rely on a client-side solution. On the other hand, given the circumstances, that may be my only option.
This particular problem won't be twitter, but if twitter goes down (which it has been known to do) then GetXmlDocumentByUrl may throw an error or delay page loading. Using JavaScript should get around this, and Warren's package linked above should be the easiest way to get it into umbraco.
If you use JavaScript, it won't be accessible and can't be indexed by search engines.
If you want I can take a look at it.
Do you have an example on how your proxy needs to be approached?
Ron
JavaScript doesn't mean inaccessible - you'd be surprised by how many people using screen reading software use normal browsers with JS enabled.
Also, it's debatable whether you want a few tweets on a website indexed by search engines when they also exist on twitter.com. And if the direct XML method causes problems when twitter (etc.) goes down, then you've got a usability issue.
Although the js/accessibility/progressive enhancement thread is for another time!
@dandrayne
I totally agree with you on:
"Also, it's debatable if you want a few tweets on a website indexed by search engines, when they also exist on twitter.com."
That depends on what the client wants.
But I do not agree with:
"you'd be surprised by how many people using screen readering software use normal browsers with js enabled."
I know that lots of people use normal browsers (our company specializes in this area and even offers software for it), but many people do not.
And LOL, I agree that the js/accessibility/progressive enhancement thread is for another time! :p