I'm not old enough to be an expert, just old enough to have used the leftovers (new computers were too expensive!): many, many devices were uppercase only — card punches, the ASR-33 teletype, the Telex system, lineprinters, "glass" teletypes, FORTRAN, COBOL, and, now that I think of it, Morse code/telegraph had always been. There was a ton of infrastructure that was uppercase only. You may be right that it wasn't ASCII's fault. Perhaps the first version of ASCII made sure to encompass what had come before, and then saner heads said "let's allow for future progress".
I'm not going to explore the entire history, but I just looked this up. TL;DR example: the addition of lowercase characters represented a jump from 6 bits to 7 bits at the hardware level:
"A six-bit character code is a character encoding designed for use on computers with word lengths a multiple of 6. Six bits can only encode 64 distinct characters, so these codes generally include only the upper-case letters, the numerals, some punctuation characters, and sometimes control characters. The 7-track magnetic tape format was developed to store data in such codes, along with an additional parity bit."
https://en.wikipedia.org/wiki/Six-bit_character_code