We consider two styles of executing a single job or algorithm: either the job is subdivided into tasks, each of which is executed on a separate processor, or the entire job is executed on a single processor whose capacity equals the sum of the capacities of the processors in the former case. The algorithm is abstracted as a collection of tasks with dependencies among them. Our model of dependencies allows sequential execution, parallel execution, synchronization, and spawning of tasks. The model assumes that the dependencies are known before the job begins and that a task is not preempted once its execution begins. Under the usual assumptions, such as exponentially distributed task execution times and Poisson arrival of input data, we show that the centralized execution completes the job faster than the decentralized execution only for a certain range of algorithm parameters. We also give counterexamples showing that, contrary to popular belief, the reverse holds for some parameter values.
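To make the comparison concrete, the following is a minimal Monte Carlo sketch, not the paper's analysis, of the two execution styles on a small task graph with exponentially distributed task times. The fork-join DAG, the rate, and the processor counts are illustrative assumptions, and the Poisson arrival of input data is not modeled here.

```python
# Sketch: compare decentralized execution (one unit-speed processor per task)
# against centralized execution (one processor with the summed capacity) on a
# fixed task DAG with exponential task work requirements. All parameters below
# are hypothetical and chosen only for illustration.
import random

# Task DAG: task -> set of predecessor tasks (a simple fork-join shape).
DAG = {
    "split": set(),
    "a": {"split"},
    "b": {"split"},
    "c": {"split"},
    "join": {"a", "b", "c"},
}
RATE = 1.0          # exponential rate of each task at unit processor speed
N_PROCS = len(DAG)  # decentralized: one unit-speed processor per task
SPEEDUP = N_PROCS   # centralized: one processor with the summed capacity

def sample_work():
    """Work requirement of a task (exponential with the given rate)."""
    return random.expovariate(RATE)

def decentralized_makespan(work):
    """Each task runs on its own unit-speed processor as soon as all of its
    predecessors have finished; execution is not preempted."""
    finish = {}
    remaining = dict(DAG)
    while remaining:
        for task, preds in list(remaining.items()):
            if preds.issubset(finish):
                start = max((finish[p] for p in preds), default=0.0)
                finish[task] = start + work[task]
                del remaining[task]
    return max(finish.values())

def centralized_makespan(work):
    """One processor with SPEEDUP times the unit capacity executes the tasks
    one at a time; any topological order gives the same makespan."""
    return sum(work.values()) / SPEEDUP

def estimate(n_runs=10_000):
    """Average makespans of the two styles over repeated random task times."""
    dec = cen = 0.0
    for _ in range(n_runs):
        work = {t: sample_work() for t in DAG}
        dec += decentralized_makespan(work)
        cen += centralized_makespan(work)
    return dec / n_runs, cen / n_runs

if __name__ == "__main__":
    d, c = estimate()
    print(f"decentralized ~ {d:.3f}, centralized ~ {c:.3f}")
```

Varying the shape of the DAG (more parallel branches versus longer sequential chains) in such a sketch changes which style finishes first, which is the kind of parameter dependence the abstract refers to.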