In my line of work there is an idea that is usually mentioned via the shorthand "Big O", ... to distill it down to bare essentials, the idea is that a process/method requiring 1 unit of resource to perform 1 unit of work can easily require more than 2 units of resource to perform 2 units of work, and that you must know how the process/method scales in order to optimize it.
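To make that concrete, here is a minimal Python sketch (the function name and the input sizes are my own illustration, not from the original): a pairwise comparison does quadratic work, so doubling the input roughly quadruples the time rather than doubling it.

```python
import timeit

def pairwise_sums(items):
    """Quadratic work: touches every pair of items, so doubling
    the input size roughly quadruples the time required."""
    return sum(a + b for a in items for b in items)

# Double the input size each round and watch the timings:
# each doubling of work costs roughly 4x the resource, not 2x.
for n in (1_000, 2_000, 4_000):
    data = list(range(n))
    t = timeit.timeit(lambda: pairwise_sums(data), number=1)
    print(f"n={n:>5}: {t:.3f}s")
```

Measuring at a few doubled sizes like this is often the quickest way to discover how a process actually scales before deciding where to optimize.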