Conventional economics says money today is worth more than money in the future. This is why people are willing to agree to pay interest on a loan (and why a creditor requires it). How much more money is worth today than in the future is determined by the discount or interest rate (depending on what kind of calculation you’re doing). This would hold true, say the economists, even if we lived in a hard money world (e.g. silver and gold), and even after accounting for the risk of default by the debtor, because of opportunity costs. Creditors and investors presumably have a choice as to what they do with their money. Sitting on your pile of treasure in a vault ensures that it doesn’t get smaller, but it also doesn’t get bigger. When they choose to make a loan or invest in an enterprise, they are, it is assumed, seeking the best possible (risk adjusted) return, and so the value of a given present pile of money at some time in the future is the principal invested plus the return earned between now and then. If you can make 10% per year on some investment, and you have $100, and someone offers to give you $105 a year from now in exchange for your $100 now, all else being equal, you refuse, invest at 10%, and end up with $110 next year instead.
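The opportunity-cost comparison above can be checked directly. A minimal sketch, using the figures from the example (the 10% rate, the $100 principal, and the $105 offer):

```python
def future_value(principal, rate, years):
    """Value of `principal` compounded annually at `rate` for `years` years."""
    return principal * (1 + rate) ** years

# The offer: $105 a year from now in exchange for $100 today.
offer = 105.0

# The alternative: invest the $100 at 10% for one year.
invested = future_value(100.0, 0.10, 1)

# The creditor refuses the offer and invests instead: $110 beats $105.
print(invested > offer)
```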
This conception of money is somewhat problematic, as it tends to render everything in the time and world of your grandchildren essentially worthless in the present. Even at a modest 5% discount rate, $100 a century from now is worth only about $0.76 today. I think the problem comes largely from the conflation of informational and material wealth, and our habit of representing both of them with the same currency.
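The century-long discounting works out like this (a minimal sketch of the standard present-value formula, with annual compounding assumed):

```python
def present_value(amount, rate, years):
    """Value today of `amount` received `years` from now,
    discounted at an annual rate of `rate`."""
    return amount / (1 + rate) ** years

# $100 a century out, at a 5% annual discount rate:
# 100 / 1.05**100 is roughly 76 cents.
print(round(present_value(100.0, 0.05, 100), 2))
```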
Informational wealth does broadly behave the way we’ve constructed money. A particular piece of knowledge is very likely to be worth more today than a year from now. Knowing how to build an atomic bomb was much more valuable in 1939 than in 1945, and the depreciation has only continued. Technology for building micron-scale integrated circuits would have been fabulously valuable in 1963, but it’s essentially worthless now. Information gets old. It goes stale.
Material wealth, in contrast, behaves in the opposite way. We can get much more destruction out of a given mass of plutonium today than we could in 1945. The processor in my laptop probably uses less silicon than an Intel 386, despite housing many orders of magnitude more computational power. Over time we get better at extracting a given amount of utility from a given lump of stuff.
Representing both kinds of wealth with the same currency is crazy. It encourages the profligate use of finite material goods today that will be worth much more tomorrow, instead of focusing our attention on improving our ability to utilize those goods, which is the only real economic growth. Knowledge, information, and technology can grow almost forever (assuming for the moment we don’t do anything stupid with them), but there’s only so much coal in the ground and iron in the mountain. We can only precipitate a certain quantity of salt into the soils of the San Joaquin Valley before the land is rendered useless for agriculture. Why not try to maximize the amount of produce we ultimately get from that land, by minimizing the water required per unit of food produced and avoiding unnecessary applications of salt?
We need to somehow arrange our economy such that material wealth has a negative discount rate, and informational wealth has a positive discount rate, so that the appreciation of material goods and the depreciation of information must at least balance each other out for a transaction to make economic sense. In order to do this without reverting to barter, we need two different currencies which cannot be interchanged.
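The sign flip above can be illustrated with a toy model. The rates and quantities here are invented for the example, not anything the essay specifies; the point is only that a negative discount rate makes holding an asset appreciate while a positive one makes it go stale:

```python
def value_after(amount, discount_rate, years):
    """Value of `amount` held for `years` at a given annual discount rate.
    A negative discount rate means the asset appreciates over time;
    a positive one means it depreciates."""
    return amount / (1 + discount_rate) ** years

# Hypothetical rates: material wealth appreciates ~3%/yr (negative discount),
# informational wealth goes stale at ~3%/yr (positive discount).
material = value_after(100.0, -0.03, 10)      # grows over the decade
information = value_after(100.0, 0.03, 10)    # shrinks over the decade

print(material > 100.0 > information)
```

Under this toy model, trading depreciating information for appreciating material stock only makes sense when the return earned on the information side at least offsets its staleness, which is the balance the two-currency arrangement is meant to enforce.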