Are there advantages to listing XML sitemap files in robots.txt vs submitting them directly to Google via Webmaster Tools?
It seems these days that a sitemap in the root named sitemap.xml will get picked up by dang near anyone in time.
It all depends on how much control you want.
If you do not want just anyone reading your sitemap, including scraper bots, then name it something unique and submit it directly to Google, Bing, and whoever else you want. Do not put it in your robots.txt file.
If you are not worried about who reads your sitemap, it is usually enough to just create a sitemap file in the root named sitemap.xml. Bing, Yandex, and Baidu all found mine just fine without anything in the robots.txt file. However, if you want it more widely known and more easily picked up, then list it in the robots.txt file. I would still submit it to Google and Bing manually; that way you can see some information about how many pages have been indexed from the sitemap, crawl errors, and so on. I do not think Google specifically looks for a sitemap if one was not submitted, though that may have changed recently.
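For reference, listing a sitemap in robots.txt is a single directive with an absolute URL (the domain and filename here are placeholders):

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap:` line is independent of any `User-agent` section, and you can list more than one sitemap by repeating the directive.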
Also consider that some of these domain stat sites look for sitemaps and will report them. So if you make yours publicly known, it will really be publicly known to anyone, including script kiddies. There are scraper bots reading sitemaps and spidering sites.
While I do not have anything in my robots.txt, my sitemap file is in the root and named sitemap.xml, and it is being reported on domain stat sites. I generally do not have any regrets over this, but once the genie is out of the bottle, it is hard to stuff back in. Just keep that in mind.