Saturday, 30 January 2016


Long ago
Even before the days when “640k should be enough for anyone”
You would run large, long-running programs as “batch jobs”
Meaning you would put them on a queue to be run, perhaps overnight.

On this particular system, there were file quotas.
When a batch job ended, your file quota would be checked.
You could exceed your quota while your job was running
But when it ended, the system would delete files if it had to,
To get you under quota.

It would start by deleting any file that was, on its own, bigger than your entire quota.
Because obviously you couldn’t get under quota while there was any such file.
Then it would delete temporary files.
Then anything that looked like a generated file,
like the output of a text formatter,
that could be regenerated if you wanted it.
It had a whole series of file types that it would look for and delete
To get you under quota.

But if you were still over quota, it would eventually delete whatever it needed to.
No file was sacred.
Except one.
The log file for the batch job you were running.
That was exempted from all these rules.
And if that file alone was bigger than your entire quota
It would begin by deleting temporary files
then generated files
then every file you owned
except that log file.
And finally, when your entire file store had been deleted
And you were still over quota
It would delete that log file.
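The deletion order described above can be sketched roughly like this (a hypothetical Python sketch with invented names; the original system was of course nothing like this, but the priority logic is the same):

```python
from dataclasses import dataclass

@dataclass
class File:
    name: str
    size: int
    kind: str  # "temporary", "generated", or "other" (invented categories)

def enforce_quota(files, quota, log_name):
    """Delete files in priority order until total size <= quota.

    The running batch job's log file is exempt from every pass
    except the very last one.
    """
    total = sum(f.size for f in files)

    def delete_matching(predicate):
        # Delete matching files (never the log) until under quota.
        nonlocal total
        for f in list(files):
            if total <= quota:
                return
            if f.name != log_name and predicate(f):
                files.remove(f)
                total -= f.size

    # 1. Any single file bigger than the entire quota.
    delete_matching(lambda f: f.size > quota)
    # 2. Temporary files.
    delete_matching(lambda f: f.kind == "temporary")
    # 3. Regenerable output (e.g. text-formatter output).
    delete_matching(lambda f: f.kind == "generated")
    # 4. Everything else -- no file is sacred, except the log.
    delete_matching(lambda f: True)
    # 5. Last resort: the log file itself.
    if total > quota:
        files[:] = [f for f in files if f.name != log_name]
    return files
```

Note the pathological case: if the log file alone exceeds the quota, passes 1 through 4 delete every other file you own, and pass 5 then deletes the log as well, leaving you with nothing.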

So when you logged in the next day
all your files had vanished
and you wouldn’t even have a record of why it happened.