Do robots.txt and sitemap.xml need to be physical files?

by ag16920 — Last Updated February 22, 2018 18:04

I have both set up in my routes:

Route::get('/robots.txt', function() {
    // robots.txt contents here
});

Route::get('/sitemap.xml', function() {
    // sitemap.xml contents here
});

I can access both of them fine in the browser, but Google Search Console reports that they are not detected. Do they need to be physical files in the site's root folder in order to be detected?
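
For reference, here is a minimal sketch of the same two routes returning explicit Content-Type headers (text/plain and application/xml), since the crawler only ever sees the HTTP response. The response bodies below are placeholder examples, not the original contents:

use Illuminate\Support\Facades\Route;

Route::get('/robots.txt', function () {
    // Placeholder robots.txt body — the real rules go here.
    $robots = "User-agent: *\nAllow: /";

    return response($robots, 200)
        ->header('Content-Type', 'text/plain');
});

Route::get('/sitemap.xml', function () {
    // Placeholder sitemap body — the real URL list goes here.
    $sitemap = '<?xml version="1.0" encoding="UTF-8"?>'
        . '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        . '<url><loc>https://example.com/</loc></url>'
        . '</urlset>';

    return response($sitemap, 200)
        ->header('Content-Type', 'application/xml');
});

Whether the content comes from disk or from a route closure, the crawler only sees the HTTP response, so the status code, headers, and body are what matter in practice.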
