Amdahl's law
Gene Amdahl is one of the great pioneers of computing, most famous for formulating Amdahl's law in a 1967 paper. The model describes how a CPU workload can be processed faster by adding more CPU resources, but the speedup gained depends on how large a part of the workload can be processed in parallel. This insight has framed discussions for decades in our relentless pursuit of shortening the time needed to finish workloads, whether they are performed by CPUs or by humans.
A very interesting property of Amdahl's law is that it lets you calculate the maximum possible speedup for a task once you know how much of the task cannot be parallelized. The findings are quite mind-blowing: if just 10% of the work cannot be parallelized, the maximum speedup you can achieve is 10. If 90% of a workload can be done in parallel but the remaining 10% cannot, you will only ever perform the task 10 times faster, no matter how many resources you throw at the problem. For many workloads the returns diminish rapidly after adding 5-10 resources.
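The limit above follows directly from Amdahl's formula: with a serial fraction s and N processors, the speedup is 1 / (s + (1 - s) / N). A minimal sketch in Python (the function name is my own):

```python
def amdahl_speedup(serial_fraction, n):
    """Maximum speedup with n processors when serial_fraction
    of the work cannot be parallelized (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n)

# 10% serial work: the speedup approaches 10 no matter how many CPUs you add
for n in (1, 2, 5, 10, 100, 10000):
    print(f"{n:>6} CPUs -> speedup {amdahl_speedup(0.10, n):.2f}")
```

Note how most of the gain arrives early: 10 CPUs already give a speedup of about 5.3, while 100 CPUs only reach about 9.2 of the theoretical maximum of 10.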
This is why, for many ordinary tasks, a 100-CPU computer wouldn't be much faster than a 10-CPU computer, even assuming the workload is optimized for parallel processing. It also explains why, when you have plenty of resources, it frequently makes sense to work on several different things rather than having all of them work on the same thing.
The same applies to organizations: as the number of people grows, it becomes increasingly important to ensure that teams are not limited by workloads that cannot be isolated from each other, or you will quickly notice that workloads do not finish any faster. In practice, larger initiatives require more synchronization, leading to an even larger drop in efficiency and even later results.
Amdahl's law says nothing about the number of independent workloads that can be handled with additional resources. As long as there is no shared external dependency (computer memory, office space), the capacity to perform multiple workloads can be greatly improved by adding resources; the law only limits how much faster any individual workload can finish.
Gene Amdahl