A simple robots.txt that allows the entire site to be crawled: no paths are excluded and no crawlers are turned away.

User-agent: *
Disallow:
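
To see how a crawler reads this, here is a quick sketch using Python's standard urllib.robotparser (the URL is a placeholder). An empty Disallow value excludes nothing, so every path is fetchable:

from urllib.robotparser import RobotFileParser

# Parse the allow-all rules shown above.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow:",
])

# An empty Disallow excludes nothing, so any URL is allowed.
print(rp.can_fetch("*", "https://example.com/any/page"))  # True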

To block crawling of a specific path:

Disallow: /admin

Or to block the entire site:

Disallow: /
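
Both Disallow forms can be sanity-checked with the same parser. One detail worth knowing: Disallow matches by path prefix, so /admin also blocks everything beneath it. A sketch, with placeholder URLs:

from urllib.robotparser import RobotFileParser

# Rules that block only the /admin section.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin",
])

# Disallow matches by prefix: /admin and anything under it is blocked.
print(rp.can_fetch("*", "https://example.com/admin/users"))  # False
print(rp.can_fetch("*", "https://example.com/about"))        # True

# "Disallow: /" is a prefix of every path, so nothing is fetchable.
rp_all = RobotFileParser()
rp_all.parse(["User-agent: *", "Disallow: /"])
print(rp_all.can_fetch("*", "https://example.com/"))  # False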

For more information, see the robots guide at https://www.robotstxt.org/.

To point crawlers at your sitemap, add a Sitemap directive:

Sitemap: https://example.com/sitemap.xml
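
Compliant parsers can pick the sitemap up from the file itself; since Python 3.8, urllib.robotparser exposes it via site_maps(). A sketch reusing the placeholder URL above:

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow:",
    "Sitemap: https://example.com/sitemap.xml",
])

# site_maps() returns the listed sitemap URLs, or None if there are none.
print(rp.site_maps())  # ['https://example.com/sitemap.xml']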