I just looked it up: apparently all tiles would take up 54 TB [0], and even just the important tiles come to around 1 TB. I now understand why simply downloading them isn't an option.
It specifically says: "OSM does NOT pre-render every tile. Pre-rendering all tiles would use around 54 TB of storage."
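The storage blowup is easy to see from the tile pyramid itself: each zoom level has 4x the tiles of the previous one. A quick sketch (the zoom cutoff of 18 and the 54 TB figure are assumptions for illustration, taken as round numbers, not from the wiki's exact methodology):

```python
# Each zoom level z contains 4**z tiles (a 2**z x 2**z grid).
# Summing zooms 0 through 18 shows why pre-rendering everything explodes.
total_tiles = sum(4 ** z for z in range(19))  # zooms 0..18 inclusive
print(f"{total_tiles:,} tiles")              # roughly 9.2e10 tiles

# Dividing the wiki's 54 TB by that count gives the implied average tile size.
avg_bytes = 54e12 / total_tiles
print(f"~{avg_bytes:.0f} bytes per tile on average")
```

The average comes out to only a few hundred bytes per tile, which fits the wiki's note that most tiles are empty ocean or otherwise trivially small; the problem is purely the sheer number of them.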
That 54 TB figure applies only if you pre-render every zoom level of every tile for the whole world into raster images. A more typical self-hosted OpenStreetMap setup renders tiles on demand as you scroll or pan the map.
Disk space usage is MUCH lower in practice: my test/development setup fits comfortably in a virtual machine with 300 GB of disk on a 12 TB hard drive that cost $145. At that rate it's using just under $4 worth of disk space, or $8 if you count the duplicate it gets backed up to.
That would be a bad idea, since the tiles are ever-changing. The master copy of the data is in a vector format; the idea is that you run the (free) rendering engine that turns the vector data into raster tiles at the various zoom levels, on demand. The vector data is really quite small: an entire large-population US state with many complex cities is under a few hundred megabytes.
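For anyone curious how an on-demand renderer knows which tile a map view needs: raster tiles are addressed by zoom/x/y using the standard Web Mercator "slippy map" formula (documented on the OSM wiki). A minimal sketch of that coordinate math, not any particular server's code:

```python
import math

def deg2tile(lat: float, lon: float, zoom: int) -> tuple[int, int]:
    """Convert a lat/lon in degrees to slippy-map tile (x, y) at a zoom level."""
    n = 2 ** zoom  # tiles per axis at this zoom
    x = int((lon + 180.0) / 360.0 * n)
    # Web Mercator y: project latitude, then scale into the 0..n tile range.
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

# Example: the origin (0, 0) at zoom 1 lands in the bottom-right of the 2x2 grid.
print(deg2tile(0.0, 0.0, 1))  # -> (1, 1)
```

A tile server just maps a request like `/1/1/1.png` through the inverse of this to a bounding box, renders that box from the vector database, and caches the result.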
[0] https://wiki.openstreetmap.org/wiki/Tile_disk_usage