If you regularly test your backups, Glacier gets a lot more expensive due to egress fees.
And it’s such a good thing to remind people of. You don’t manage a backup service. Nobody cares about backups.
People care about restores.
Glacier egress is cheaper than it used to be, and you can combine Glacier with other storage classes using lifecycle rules.
I would personally be fine running restore tests on recently backed-up data and only spot-checking the data in Glacier once or twice. Think about why you test backups in the first place: the main errors you’re trying to catch are misconfiguration, backups not getting scheduled or not running, or not having access to your encryption keys. You can put your most recent backup in Infrequent Access and let older objects age out to Glacier with lifecycle rules.
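The tiering described above can be sketched as an S3 lifecycle configuration. This is a minimal illustration, not a drop-in config: the bucket name, prefix, and day thresholds are all hypothetical assumptions, and the actual `boto3` call is left commented out since it needs AWS credentials.

```python
# Sketch of the lifecycle-rule approach: recent backups sit in
# Infrequent Access where restore tests are cheap, older objects
# age out to Glacier and eventually Deep Archive.
# (Prefix "backups/" and the day thresholds are made up for illustration.)
lifecycle_config = {
    "Rules": [
        {
            "ID": "age-backups-to-glacier",
            "Status": "Enabled",
            "Filter": {"Prefix": "backups/"},
            "Transitions": [
                # Recent backups: cheap to retrieve for restore tests.
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                # Older backups: rarely restored, Glacier is fine.
                {"Days": 90, "StorageClass": "GLACIER"},
                # Ancient backups: compliance/archive territory.
                {"Days": 365, "StorageClass": "DEEP_ARCHIVE"},
            ],
        }
    ]
}

# To apply it (requires AWS credentials, so not run here):
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-backup-bucket",  # hypothetical bucket
#     LifecycleConfiguration=lifecycle_config,
# )
```

One caveat worth knowing: S3 enforces minimums on transitions (e.g. objects must sit at least 30 days before moving to Infrequent Access), so very aggressive day values will be rejected.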
Glacier used to have really expensive retrieval costs. That tier is now called “Glacier Deep Archive,” and as far as I know its major use cases are things like corporate recordkeeping / compliance (e.g. Sarbanes-Oxley). Customers of Deep Archive should be sophisticated ones.