# Running Time, Growth of Function and Asymptotic Notations

## Introduction

Writing a computer program that handles a small data set is entirely different from writing one that takes a large amount of input data. A program written to handle a large input MUST be algorithmically efficient in order to produce its result in reasonable time and space. In this article, I discuss the basics of what the running time of a program is, how we represent it, and other essentials needed for the analysis of algorithms. Please bear with me; the article is fairly long. I promise you will learn quite a few concepts here that will help you cement a solid foundation in the design and analysis of algorithms.

## Running Time

Suppose you developed a program that finds the shortest distance between two major cities of your country. You showed the program to a friend, who asked, “What is the running time of your program?” You answered promptly and proudly, “Only 3 seconds.” It sounds practical to state the running time in seconds or minutes, but is that sufficient? Does that statement fully answer the question? The answer is NO. Measuring running time like this raises many other questions:

- What is the speed of the processor of the machine the program is running on?
- What is the size of the RAM?
- What programming language is it written in?
- How experienced and skillful is the programmer?
- and much more

To fully answer your friend’s question, you would have to say something like “My program runs in 3 seconds on an Intel Core i7 8-core 4.7 GHz processor with 16 GB of memory, and is written in C++14.” Who would answer this way? Of course, no one. Running time expressed in time units depends on many factors: the computer being used, the programming language, the skill of the programmer, and so on. Therefore, expressing running time in seconds or minutes makes little sense in computer programming.

You are now convinced that “seconds” is not a good way to measure running time. The question now is how we should represent the running time so that it is not affected by the speed of the computer, the programming language, or the skill of the programmer. In other words, how can we represent the running time so that all those dependencies are abstracted away? The answer is simple: the input size. To solve these dependency problems, we represent the running time in terms of the input size. If the input size is $n$ (which is always positive), then the running time is some function $f$ of $n$, i.e.
$$\text{Running Time} = f(n)$$
The value of $f(n)$ gives the number of operations required to process an input of size $n$. So the running time is the number of operations (instructions) required to carry out the given task.
The function $f(n)$ is monotonically non-decreasing: if the input size increases, the running time increases or remains constant. Some examples of running times are $n^2 + 2n$, $n^3$, $3n$, $2^n$, $\log n$, etc. With this knowledge, if anyone asks you about the running time of your program, you would say “the running time of my program is $n^2$ (or $2n$, $n\log n$, etc.)” instead of “my program takes 3 seconds to run.”

The running time is also called the *time complexity*.
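To make the idea of “running time as a count of operations” concrete, here is a minimal sketch in Python: a hypothetical task (checking every pair of items) instrumented with an operation counter. The function name and the task itself are illustrative, not from the article.

```python
def count_operations(items):
    """Count comparisons performed while checking every pair of
    items for duplicates (a hypothetical quadratic-time task)."""
    ops = 0
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            ops += 1                  # one comparison per pair
            _ = items[i] == items[j]
    return ops

# For input size n, the pair check performs n*(n-1)/2 comparisons,
# so its running time is roughly f(n) = n^2 / 2.
print(count_operations(list(range(10))))   # prints 45 == 10*9/2
```

The count depends only on the input size $n$, not on the machine or language the code runs on, which is exactly the abstraction we wanted.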

### What exactly is the input size?

Input size informally means the number of items in the input. For example, in sorting, the size is the number of items to be sorted. In graph algorithms, the size is the number of nodes or edges in the graph.
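As a quick sketch, the two notions of input size mentioned above can be computed directly in Python (the list and adjacency-list graph below are made-up examples):

```python
# Input size for a sorting problem: the number of items.
numbers = [5, 3, 8, 1]
n = len(numbers)                                     # n == 4

# Input size for a graph problem: nodes and edges.
# Undirected graph stored as an adjacency list (hypothetical).
graph = {"a": ["b", "c"], "b": ["a"], "c": ["a"]}
num_nodes = len(graph)                               # 3 nodes
num_edges = sum(len(adj) for adj in graph.values()) // 2  # 2 edges
print(n, num_nodes, num_edges)
```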

## Growth of Functions

In the previous section, I said that running times are expressed in terms of the input size ($n$). Three possible running times are $n$, $n^2$ and $n^3$. Which of these three is best? In other words, which function grows most slowly with the input size? To find out, we need to analyze the growth of the functions, i.e. we want to know how quickly the running time goes up as the input increases.

The easiest way to compare different running times is to plot them and examine the shapes of the curves. The following figure shows the graphs of $n$, $n^2$ and $n^3$ (the x-axis represents the input size and the y-axis the number of operations required, i.e. the running time).

Looking at the figure above, we can clearly see that $n^3$ grows faster than $n$ and $n^2$. Therefore, running time $n$ is better than running times $n^2$ and $n^3$. One thing to note is that the input sizes here are very small; I deliberately used small inputs only to illustrate the concept. In computer science, especially in the analysis of algorithms, we do the analysis for very large input sizes.
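Even without a plot, tabulating the three functions at a few input sizes makes the difference in growth obvious. A minimal sketch:

```python
# Tabulate n, n^2 and n^3 for growing input sizes to compare growth.
sizes = [1, 10, 100, 1000]
table = [(n, n**2, n**3) for n in sizes]

for n, squared, cubed in table:
    print(f"n={n:>5}  n^2={squared:>8}  n^3={cubed:>12}")
```

At $n = 1000$, the $n^3$ running time already needs a billion operations, a thousand times more than $n^2$, which is why slower-growing functions are preferred for large inputs.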

Another way of checking whether a function $f(n)$ grows faster or slower than another function $g(n)$ is to divide $f(n)$ by $g(n)$ and take the limit as $n \to \infty$:
$$\lim_{n \to \infty}\frac{f(n)}{g(n)}$$

If the limit is $0$, $f(n)$ grows slower than $g(n)$. If the limit is $\infty$, $f(n)$ grows faster than $g(n)$. If the limit is a nonzero constant, the two functions grow at the same rate.
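The limit test above can be approximated numerically by evaluating the ratio at a large $n$. A minimal sketch (the helper name `growth_ratio` and the choice of $n = 10^6$ are my own):

```python
def growth_ratio(f, g, n=10**6):
    """Approximate the limit of f(n)/g(n) by evaluating the ratio
    at one large value of n."""
    return f(n) / g(n)

# f(n) = n^2 grows slower than g(n) = n^3: the ratio tends to 0.
print(growth_ratio(lambda n: n**2, lambda n: n**3))  # 1e-06

# f(n) = n^3 grows faster than g(n) = n^2: the ratio blows up.
print(growth_ratio(lambda n: n**3, lambda n: n**2))  # 1000000.0
```

This is only a heuristic check at a single point, not a proof; for a rigorous answer you would compute the limit symbolically.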

The table below shows common running times in algorithm analysis. The entries in the table are listed from the slowest-growing (best) to the fastest-growing (worst) running times.

| Running Time | Examples                  |
|--------------|---------------------------|
| Constant     | $1$, $2$, $100$, $300$, … |
| Logarithmic  | $\log n$, $5\log n$, …    |
| Linear       | $n$, $n + 3$, $2n + 3$, … |
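As a rough sketch of what these classes look like in code, here are three hypothetical functions whose operation counts are constant, logarithmic, and linear in the input size, respectively:

```python
def get_first(items):
    """Constant time: one operation, regardless of input size."""
    return items[0]

def halving_steps(n):
    """Logarithmic time: about log2(n) halvings, as in binary search."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

def total(items):
    """Linear time: visits every item exactly once."""
    s = 0
    for x in items:
        s += x
    return s

print(get_first([7, 8, 9]))    # 7   (1 operation)
print(halving_steps(1024))     # 10  (log2(1024) halvings)
print(total([1, 2, 3, 4]))     # 10  (4 additions)
```

Doubling the input does not change the work in `get_first`, adds only one step to `halving_steps`, and doubles the work in `total`, matching the three rows of the table.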
