It's just that if you have a 1 TB drive, using 5% of it doesn't seem that bad.
The trickier and more interesting question is whether the increase in size brings a perceptible increase in quality. E.g., some 80s stuff made with low-bit samplers and tiny sample memory, recorded to analog tape, sounds better to me than most of what I've heard in the last 10 years. One theory: the people making 80s pop worked the material they had much harder, whereas producers now may start from higher-quality samples but put less further work and time into them. Here's how I'd imagine a bean counter turned musician would think: every track sells for about the same and the public won't pay much more, so pump out a lot of content to capture market share rather than spend more time on one thing hoping it becomes a big hit. (Actually that's how some music producers thought even way back: one producer working under many names, labels, and artist identities, and artists having multiple brands/names, at least in electronic music.)
Particularly interesting in 3D graphics: for some games there are plenty of screenshots taken at higher resolutions than the game originally supported, and the textures etc. seem to have increased in quality even though no changes were made to them! My theory is that the game engine or driver optimizations sometimes have a big impact on final image quality, which may have been necessary for good benchmark results when the game was released; years later, with more GPU power, rendering at a higher resolution (and scaling back down to the original) is like taking the shades off: whatever was degrading the image gets bypassed, revealing that the textures in the game weren't that bad after all. E.g., some Mass Effect screenshots I've seen looked as if there was 2x more detail in the textures than you could ever access at native resolution.
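One concrete mechanism that would produce exactly this (a minimal sketch of standard mipmap LOD selection, simplified to one axis; the function name is just for illustration and this isn't tied to any specific game or driver): the GPU picks a mip level from how many texels land on each screen pixel, so rendering the same scene at twice the resolution samples one mip level sharper, with zero changes to the texture assets.

    import math

    def mip_level(texture_px, screen_px):
        # lambda = log2(texels sampled per screen pixel); the standard
        # mipmap LOD formula, reduced to a single axis for illustration
        footprint = texture_px / screen_px
        return max(0.0, math.log2(footprint))  # mip 0 is the sharpest level

    # A 1024px texture covering 256 screen pixels at the game's native res:
    print(mip_level(1024, 256))   # 2.0 -> the GPU samples the 256x256 mip

    # Same surface rendered at 2x resolution (then downscaled for the shot):
    print(mip_level(1024, 512))   # 1.0 -> the 512x512 mip, twice the detail

And if a driver of that era added a positive LOD bias on top of that value to trade sharpness for benchmark speed (a known trick back then), brute-force supersampling would sidestep it, which would fit the "taking the shades off" effect I'm describing.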