Today, with multi-core CPUs running at gigahertz speeds, attached to gigabytes of RAM and terabytes of disk space, we are used to running programs that demand what would have been enormous resources back when mainframes ruled the data centre.
My question is "how much did one second of computer time cost in 1975?" This assumes the program in question is being run on a mainframe from IBM or any of the other manufacturers of the day. Of course, there are a lot of factors to consider: leasing costs, staffing, power and cooling, whether the system could run more than one program simultaneously, and how computer time was charged back to the users. The kind of back-of-envelope calculation I have in mind is sketched below.
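For illustration only, here is a minimal sketch of that chargeback arithmetic. Every figure in it is a made-up placeholder, not an actual 1975 price; the point is the structure of the estimate (monthly costs divided by billable CPU-seconds), which is what I'm hoping someone can put real numbers to.

```python
# Hypothetical back-of-envelope chargeback estimate.
# All dollar figures below are placeholders, NOT real 1975 prices.

monthly_lease = 50_000.0          # hypothetical monthly lease for the machine ($)
monthly_staff = 20_000.0          # hypothetical operators and systems programmers ($)
monthly_power_cooling = 5_000.0   # hypothetical power and cooling ($)

hours_per_month = 720             # machine powered on around the clock
utilisation = 0.60                # assumed fraction of wall-clock time billed out;
                                  # multiprogramming would raise this considerably

billable_cpu_seconds = hours_per_month * 3600 * utilisation
total_monthly_cost = monthly_lease + monthly_staff + monthly_power_cooling

cost_per_cpu_second = total_monthly_cost / billable_cpu_seconds
print(f"~${cost_per_cpu_second:.3f} per billed CPU-second")
```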
I'm interested to know whether anyone on this forum has had experience with this sort of system accounting, and whether they can recall some actual numbers.
via International Skeptics Forum https://ift.tt/3b8hIMK