I am using a "cloud" hosting provider (Heroku) to host my webapp. Since I don't have access to a permanent file system, I am storing my sitemap.xml in Amazon S3.
I wanted to know the SEO implications of the following 2 options for submitting the sitemap to search engines (Google & Bing) via their webmaster tools:
1) Create an endpoint on my domain, http://mydomain.com/sitemap.xml, that performs a 301 redirect to the S3-hosted sitemap, and provide the URL on my domain to the search engines. This is the option I am currently using. It seems to work fine with Google, but I noticed a "sitemap error" with Bing - I am monitoring this as I am not yet sure what the cause is.
2) Apparently, there is a way to do "cross-domain" sitemap submission, whereby I get the S3 URL approved by the search engine and can then submit the S3 URL directly as my sitemap.
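For context, my option (1) endpoint is essentially just a 301. A minimal sketch as a plain WSGI app (the bucket URL below is a placeholder, not my real bucket):

```python
# Sketch of option (1): /sitemap.xml on my domain 301-redirects to the
# sitemap stored in S3. The S3 URL here is a hypothetical placeholder.
S3_SITEMAP_URL = "https://my-bucket.s3.amazonaws.com/sitemap.xml"

def app(environ, start_response):
    if environ.get("PATH_INFO") == "/sitemap.xml":
        # Permanent redirect to the S3-hosted sitemap
        start_response("301 Moved Permanently", [("Location", S3_SITEMAP_URL)])
        return [b""]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not Found"]
```

So a crawler requesting http://mydomain.com/sitemap.xml receives a 301 with a `Location` header pointing at S3.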
Also, I am currently pointing the Sitemap entry in robots.txt to the sitemap URL hosted on my domain (not to S3).
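Concretely, my robots.txt currently looks roughly like this (domain is a placeholder):

```
User-agent: *
Sitemap: http://mydomain.com/sitemap.xml
```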
Is one of these methods preferred from an SEO perspective? Like I said, I am using option (1) but I want to be somewhat confident that the crawlers will be OK with the HTTP 301 that I'm using.
Google seems to be much more forgiving of cross-domain sitemaps than Bing. Bing does support cross-domain sitemaps, but with a number of caveats.
According to that document, to get your cross-domain sitemap to work with Bing, you should link directly to the sitemap on the other domain in your robots.txt file, rather than linking to a URL on your own site that then redirects.
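In other words, your robots.txt would point straight at the S3 URL, roughly like this (bucket name is a placeholder):

```
User-agent: *
Sitemap: https://my-bucket.s3.amazonaws.com/sitemap.xml
```

Since robots.txt lives on your domain, this is what establishes that the cross-domain sitemap is authorized for your site.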