When multiple grid applications run on a shared grid computing infrastructure, the resource allocation policy affects the time to complete these applications. In this paper, we formulate an analytical model that lets us compare different allocation policies. We show that a uniform allocation policy penalizes large jobs (where a job's size is the amount of work an application requires), whereas a linear allocation of resources penalizes small jobs. In particular, we study an allocation policy that aims to minimize the average job completion time. We show that such a policy can reduce the average completion time by as much as 50% relative to the uniform or linear allocation policies. Such a policy is also fair to applications, since it penalizes neither small jobs nor large ones, as other policies (such as uniform or linear) do.
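The contrast between the policies can be seen in a toy example (not the paper's model): four jobs of hypothetical sizes share a unit-capacity resource. Under uniform allocation, large jobs finish late; under linear (proportional) allocation, all jobs finish together, so small jobs wait as long as large ones. As an illustrative stand-in for an average-minimizing policy, a shortest-job-first schedule run at full capacity substantially lowers the average completion time; the paper's actual policy may differ.

```python
# Toy illustration with hypothetical job sizes (work units) sharing
# a resource of total capacity C = 1.
jobs = [1.0, 2.0, 4.0, 8.0]
C = 1.0
n = len(jobs)
total = sum(jobs)

# Uniform: every job receives C/n; completion time = size / (C/n).
# Large jobs finish far later than small ones.
uniform = [w / (C / n) for w in jobs]

# Linear (proportional): job i receives C * w_i / total, so every job
# finishes at the same time total/C; small jobs are held back.
linear = [w / (C * w / total) for w in jobs]

# Shortest-job-first, run sequentially at full capacity: an illustrative
# policy that minimizes average completion time in this simple setting.
sjf, t = [], 0.0
for w in sorted(jobs):
    t += w / C
    sjf.append(t)

avg = lambda xs: sum(xs) / len(xs)
print(avg(uniform), avg(linear), avg(sjf))  # 15.0 15.0 6.5
```

Here the shortest-job-first schedule cuts the average completion time from 15.0 to 6.5, a reduction of more than 50%, consistent in spirit with the improvement reported above.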