Imagine you could pay with your computer's resources. Suppose there is something you'd like to buy, such as an online course.
You happen to have solar panels or some other form of renewable electricity, or perhaps you simply don't pay an electricity bill. Or you are in a place or living situation where it is difficult to acquire fiat currency, but you do have a computer.
In this case you would download a special app. It would operate in a sandbox, where it can't see anything on your computer or internet connection, but can do its own work, compiling or evolving programs for example, until it reaches the wattage limit you set or brings your account into good standing.
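As a thought experiment, the metering side of such an app could be a simple energy-budget loop: run units of sandboxed work, estimate the energy each unit consumed, and stop once the budget is spent. This is a hypothetical sketch, not a real product; `do_work_unit`, the 100 W draw estimate, and the crude time-based energy model are all illustrative assumptions.

```python
import time

def run_until_budget(budget_wh, draw_watts=100.0,
                     do_work_unit=lambda: time.sleep(0.01)):
    """Run work units until the estimated energy used reaches budget_wh.

    draw_watts is an assumed average power draw; a real app would need
    to measure this rather than guess it.
    """
    used_wh = 0.0
    while used_wh < budget_wh:
        start = time.monotonic()
        do_work_unit()                        # e.g. compile or evolve a program
        elapsed_hours = (time.monotonic() - start) / 3600
        used_wh += draw_watts * elapsed_hours  # crude energy estimate
    return used_wh
```

A real implementation would also need to report the work done back to the exchange so the account can be credited, which is where the sandbox and verification problems get hard.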
The app, or electricity-based currency exchange, would be pegged to some reasonable average of global electricity prices. So it might even be lucrative to run this app if you happen to be in a country where, or have a setup whose, electricity costs are lower than that average. It would also make sense to run the app if you are off-grid and have surplus electricity which can't be stored, in which case the surplus can be assigned to and used by the app.
Of course, for an average 100 W computer running around the clock, you'd only be spending 2.4 kWh a day, which at a rough international average of 20 cents per kWh comes to about 48 cents a day. However, humans don't have to work 100% of the day to earn a living wage. Most working-class humans effectively get a 60-80% bonus (they don't have to work for 60-80% of the time and still meet their bills). So it makes sense to give the same bonus to these computers.
Now that would be roughly $1 per day, or about $30 a month and $365 a year, enough to keep the computer in good repair. If it also had a powerful GPU card drawing an estimated 500 W, the total draw of around 600 W would earn close to $2,000 a year, enough for some serious upgrades, even a whole new computer.
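The arithmetic above can be checked with a few lines. The 20 cents/kWh price and the 60-80% bonus are the assumptions from the text, not real market data; the function below just makes them explicit.

```python
def daily_credit(watts, price_per_kwh=0.20, bonus=0.8):
    """Credit earned per day for a machine drawing `watts` continuously.

    price_per_kwh and bonus are the illustrative figures assumed in
    the text: a ~20 cent global average price and a 60-80% "living
    wage" bonus (the upper end used here).
    """
    kwh_per_day = watts * 24 / 1000        # e.g. 100 W -> 2.4 kWh/day
    base = kwh_per_day * price_per_kwh     # raw electricity value
    return base * (1 + bonus)              # apply the bonus

print(round(daily_credit(100), 2))       # ~0.86 USD/day for a 100 W machine
print(round(daily_credit(600) * 365))    # ~1892 USD/year with a 500 W GPU added
```

With the 80% bonus, a 100 W machine earns about 86 cents a day (round to ~$1), and a 600 W machine (computer plus GPU) lands just under $2,000 a year.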
If the online course only cost $30, then you could pay it off within a month, simply by keeping your computer on.
Various cloud services could also be hosted on these computers, especially workloads that don't require high-performance, framework-compatible GPUs.