Are the URLs yours? If so, Google Analytics or Looker Studio are potential options. There's also the Search Console API (which I've not used myself), which you can drive from a Python batch script to pull data in bulk.
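As a rough sketch of what that batch approach might look like (hedged: I haven't used this API, and the property URL, dates, and credentials below are all placeholders), the Search Analytics endpoint takes a JSON body with a date range and dimensions:

```python
# Sketch of pulling per-URL click data from the Search Console API.
# Assumes you've set up OAuth credentials for google-api-python-client;
# everything here (dates, site URL) is a placeholder, not a working config.

def build_query(start_date, end_date, row_limit=1000):
    """Build a Search Analytics query body grouped by page URL."""
    return {
        "startDate": start_date,   # ISO format, e.g. "2024-01-01"
        "endDate": end_date,
        "dimensions": ["page"],    # one row per URL
        "rowLimit": row_limit,
    }

body = build_query("2024-01-01", "2024-01-31")

# With credentials in place, the actual call would look something like:
# from googleapiclient.discovery import build
# service = build("searchconsole", "v1", credentials=creds)
# response = service.searchanalytics().query(
#     siteUrl="https://example.com/", body=body).execute()
# for row in response.get("rows", []):
#     print(row["keys"][0], row["clicks"], row["impressions"])
```

You'd loop that over date ranges (or swap `"page"` for `"query"` to group by search term) to build up whatever batch report you need.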
If they're your URLs, you can also use something like screamingfrog.co.uk (paid version) to connect Google Analytics and Search Console, and then query either the entire site or segments of it. You can set date ranges for the data you want to extract, and it does a pretty good job.
If the URLs are not yours, then you're into the territory of tools like Ahrefs, Semrush, and Moz, which scan a site and produce traffic estimates rather than exact figures.