Multi-lingual sites and SEO
Hi there,
I'm currently building a multi-lingual Umbraco site (4.7.2). The structure I have adopted is as follows:

Content
- Root
  - en
  - de
  - sk
  - cz

I've used a 'Root' page, which is a blank template/doctype, because I need specific /xx URLs for each country. Without the 'Root' content tree node, the default site served was 'en', but without the /en in the URL.
Having the 'Root' node gets round this, and it is also a good place for our back-end devs to put their geo-location logic, which will redirect you to the relevant /xx site.
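To make that concrete, here is a rough sketch of the kind of thing I have in mind for the Root page's code-behind. The names (RootRedirect, GetCountryCodeForIp) and the geo lookup itself are placeholders, not our actual implementation:

    // Rough sketch only - placeholder names, not our real implementation.
    // Code-behind for the Root page's otherwise blank template.
    using System;
    using System.Web;

    public partial class RootRedirect : System.Web.UI.Page
    {
        // Placeholder list of the country folders under Root.
        private static readonly string[] SupportedSites = { "en", "de", "sk", "cz" };

        protected void Page_Load(object sender, EventArgs e)
        {
            // GetCountryCodeForIp stands in for whatever geo-IP lookup
            // the back-end devs end up using.
            string country = GetCountryCodeForIp(Request.UserHostAddress);

            // Unknown or unsupported countries fall back to the English site.
            string target = Array.IndexOf(SupportedSites, country) >= 0 ? country : "en";

            // Response.Redirect issues a 302, which seems right here since the
            // destination depends on who is asking.
            Response.Redirect("/" + target + "/", true);
        }

        private static string GetCountryCodeForIp(string ip)
        {
            // Placeholder: a real geo-IP database or service would go here.
            return "en";
        }
    }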
However, I can't figure out what effect this will have on SEO. My current thinking is that our geo-location logic will either not work for a bot request, or it will geolocate the bot somewhere outside the five countries I have sites for (the US, perhaps). In either case our logic will serve the English site by default, so that is what gets indexed while the other four country sites never will be.
I imagine one way around this would be to have the 'Root' page actually be a working page, with links to all five country sites that a bot could crawl and index in the normal fashion. But we need our geolocation to be automatic, which is why the Root page is currently just a placeholder: it doesn't serve anything to users, because they'll never see it.
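If we went down that route, I imagine the Root template would keep plain links to each /xx site in its markup, and the code-behind above would only redirect requests that don't look like crawlers. Something like this, where IsProbablyBot is a naive placeholder rather than a proper crawler check:

    // Same code-behind as the sketch above, but crawlers skip the redirect
    // and see the Root page itself, whose markup contains plain links to
    // each /xx site.
    protected void Page_Load(object sender, EventArgs e)
    {
        if (IsProbablyBot(Request.UserAgent))
        {
            return; // no redirect: the crawler follows the country links instead
        }

        string country = GetCountryCodeForIp(Request.UserHostAddress);
        string target = Array.IndexOf(SupportedSites, country) >= 0 ? country : "en";
        Response.Redirect("/" + target + "/", true);
    }

    private static bool IsProbablyBot(string userAgent)
    {
        // Naive placeholder check; a real version would use a proper list
        // of crawler user agents.
        return (userAgent ?? string.Empty).ToLowerInvariant().Contains("bot");
    }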
I have seen that you can use the 'Set geographic target' setting in Google's Webmaster Tools, but my worry is that because our 'Root' page automatically geolocates users, there will be no way for the bot to ever reach and crawl the other four sites anyway.
Apologies for the convoluted nature of the question, but it's the best way I could explain it.
Does anyone have any advice on doing this?
Thanks in advance, Phil