Determining what the average statistics are for a game system is pretty important: they anchor every other number in the system. I define average in this instance as around a fifty-fifty chance of accomplishing a reasonably complex task, such as maintaining a computer. There are two components in this system to consider: raw natural ability (abilities) and learned/experienced knowledge (skills).
For the purposes of my percentile system, we will assume that a trained programmer has 25 in a computer programming skill and 25 in Intelligence. These add up to a roll modifier of +50, meaning the d100 roll needs to be 50 or higher to complete a reasonably complex program. Statistically, then, difficulty modifiers (situational modifiers representing the difficulty or complexity of the task) greater than +50 mean the tasks are increasingly simpler to accomplish: at +50, our average trained programmer's total modifier of +100 already meets the target of 100 before the die is even rolled. A difficulty modifier of +75 means that most people can accomplish the task even without training; those who have training are simply better at it. Our average Joe's total modifier on a +75 task would be +125, while somebody with only Intelligence 25 would have a total modifier of +100; that's the equivalent of writing a "Hello World!" program.
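The resolution mechanic above can be sketched in a few lines of Python. The function name and signature are my own shorthand, not part of the system; only the arithmetic comes from the rules as described.

```python
def check_succeeds(roll, ability, skill, difficulty, target=100):
    """A check succeeds when d100 roll + ability + skill + difficulty
    reaches the target number of 100."""
    return roll + ability + skill + difficulty >= target

# The trained programmer (Intelligence 25, Programming 25) attempting a
# reasonably complex program (difficulty +0) succeeds on a roll of 50:
print(check_succeeds(50, 25, 25, 0))   # True
print(check_succeeds(49, 25, 25, 0))   # False

# On a "Hello World!" task (+75), even the minimum roll of 1 succeeds:
print(check_succeeds(1, 25, 25, 75))   # True
```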
Conversely, difficulty modifiers below +50 mean that a task is increasingly more difficult. At +0, an average Ability + Skill of 50 has a fifty-fifty chance of success (target number 100, remember?). At -25, that chance drops to roughly one in four, while -50 drops it to one in a hundred, because the roll would have to be exactly 100 on the die.
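Those odds are easy to verify by counting which of the 100 possible d100 rolls reach the target at each difficulty. Again, the function name here is mine, not part of the system.

```python
def success_chance(ability, skill, difficulty, target=100):
    """Fraction of the 100 possible d100 rolls that meet the target."""
    needed = target - (ability + skill + difficulty)
    return sum(1 for roll in range(1, 101) if roll >= needed) / 100

# Average trained character (Ability 25 + Skill 25) at each difficulty:
for difficulty in (0, -25, -50):
    print(difficulty, success_chance(25, 25, difficulty))
```

This prints 0.51 at +0, 0.26 at -25, and 0.01 at -50: a hair above the quoted fifty-fifty and one-in-four odds, because a roll exactly equal to the needed number also succeeds.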
This system doesn't have any automatic failures or automatic successes (yet... I might change my mind later). So I define the average ability score (whatever those abilities turn out to be) as 25, and the average skill level of somebody trained as 25.