A robots.txt is a great tool when used correctly.
You can specify sitemaps and control what Googlebot and Bingbot crawl (and, for Bing, how fast) so that your crawl budget is not wasted on needless items.
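A minimal robots.txt along those lines might look like the sketch below. The domain and paths are placeholders, and note that Crawl-delay is honored by Bing but ignored by Googlebot, whose crawl rate is managed by Google itself.

```
# Point crawlers to the sitemap
Sitemap: https://www.example.com/sitemap.xml

# Keep well-behaved crawlers out of low-value areas
User-agent: *
Disallow: /cart/
Disallow: /search/

# Slow Bing down; Google ignores this directive
User-agent: bingbot
Crawl-delay: 10
```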
Many search engines and tools will respect robots.txt and stay out of the areas you disallow, but it's not a failsafe, it's a request. Less scrupulous crawlers, many site analysis tools, and AI scrapers simply ignore it.
If you want to protect an area from outside access, you need to password-protect it, restrict it by IP, or use another server-side control.
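As a rough sketch of what that looks like, here is an nginx location block that requires either HTTP Basic Auth or a matching IP before anything under /private/ is served. The path, credentials file, and network range are placeholders for your own setup.

```
location /private/ {
    # Allow access if EITHER condition below is met
    satisfy any;

    # Option 1: only requests from this network range
    allow 203.0.113.0/24;
    deny all;

    # Option 2: valid username/password from the htpasswd file
    auth_basic "Restricted area";
    auth_basic_user_file /etc/nginx/.htpasswd;
}
```

Unlike a Disallow rule, this is enforced by the server, so it applies to every visitor whether or not they read robots.txt.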
robots.txt has its benefits when it comes to crawling and SEO; it's just not to be used as a security blanket to block areas, because that doesn't work.