An algorithm is a step-by-step procedure for carrying out a set of instructions in a desired order, within a finite amount of space and time, to produce the required output. Algorithms are created independently of any programming language; they can later be implemented in almost any language, in its syntactical format, to achieve their goal.

The word algorithm is named after the 9th-century Persian mathematician Muhammad ibn Musa al-Khwarizmi, whose name was Latinized as Algoritmi when one of his books on arithmetic was translated.

Algorithms are essential for computers to process data. Typically, an algorithm reads data from an input source, stores it in one or more data structures as part of the internal state for further processing, and writes or displays the result efficiently on an output device.

An algorithm is the logic for the solution of a problem, expressed either as an informal high-level description called pseudocode or as a flowchart.

**Pseudocode** is an informal high-level description of a computer program or other algorithm. Pseudocode omits details such as variable declarations and some subroutines that are essential for machine understanding.
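As an illustration, here is a short pseudocode sketch (hypothetical, not tied to any particular language) for finding the largest of n numbers in an array A:

```
FindMax(A, n)
    max ← A[0]
    for i ← 1 to n - 1
        if A[i] > max then
            max ← A[i]
    return max
```

Note how it states the logic precisely while leaving out language-specific details such as type declarations.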

**Flowchart** is a pictorial representation of an algorithm. A flowchart uses various kinds of boxes, connected by arrows, to show the order of the workflow.

Designing an algorithm refers to a mathematical process for solving a problem. It draws on many techniques from operations research and algorithm theory, such as backtracking, dynamic programming, and divide-and-conquer.

The most important aspects to consider while designing an algorithm are that it should have an efficient run time, known as its **time complexity (Big O)**, and should use as little memory as possible, known as its **space complexity**.

Time and space complexity are used to determine the performance of an algorithm. Let's understand what these two terms are.

## Space Complexity of Algorithm

The total amount of memory space an algorithm requires during its execution, including **auxiliary space** and the space for the input, is called its space complexity. It is important on multi-user systems and wherever memory is limited.

The term space complexity is often confused with auxiliary space. **Auxiliary Space** is only the extra, temporary space used by an algorithm, not counting the input.

**An algorithm requires space for the following components:**

**Instruction Space:** This space is fixed, depending on the number of lines of code; it is the memory required to store the executable file (e.g. the .exe) of the program.

**Data Space:** The amount of memory required to store variables, including temporary variables and constant values.

**Memory Stack:** This is the space required to store the data of a **recursive** function (a function calling itself again and again). In recursion, the current variables are pushed onto the stack for further execution; afterwards, the stored data is needed to resume the suspended function.

*Note: While calculating space complexity, we consider only the data space and neglect the instruction space and the stack.*

## Time Complexity of Algorithm

The total time required by a program to complete its execution on a given input is known as its time complexity. An efficient algorithm takes the minimum time to execute.

Time complexity plays an important role in deciding which algorithm should be used, because there is almost always more than one solution to a problem.

The time complexity of an algorithm is expressed using big O notation. We'll study it in detail in the next tutorials.

As an example, let's write algorithms to calculate the sum of the first n numbers and the square of n.

```
//program 1
#include <stdio.h>

int main() {
    int i, sum = 0;
    int n = 5;
    for (i = 0; i <= n; i++) {
        sum = sum + i;
    }
    printf("sum of numbers = %d", sum);
    return 0;
}
```

```
//OUTPUT
sum of numbers = 15
```

```
//program 2
#include <stdio.h>

int main() {
    int n = 5, sq;
    sq = n * n;
    printf("square of n = %d", sq);
    return 0;
}
```

```
//OUTPUT
square of n = 25
```

Time complexity is usually estimated by counting the number of steps an algorithm takes to finish its execution.

In the example above, the first program uses a loop that runs n times to get the result, so as the value of n increases, the time also increases. The second program's time complexity is constant, because it produces the result in a single step, independent of the input n.

As seen above, an algorithm's performance varies with different inputs; hence, to compare algorithms we usually use the worst-case time complexity (the maximum time taken).

The time required by an algorithm has three cases:

- **Best Case** − Minimum time required by an algorithm to execute.
- **Average Case** − Average time required by an algorithm to execute.
- **Worst Case** − Maximum time required by an algorithm to execute.

## Asymptotic Notations

While analyzing the complexity of an algorithm in terms of time and space, we cannot give an exact number for the time and space required; instead, we use standard notations known as **Asymptotic Notations**.

Whenever we analyze an algorithm, we get a formula representing the approximate time and space the computer needs to run a certain number of lines of code, accounting for variables, constants, memory accesses, comparisons, space for temporary variables, and so on.

For example, suppose the time complexity of a certain algorithm works out to `T(n) = n² + 2n + 1`, which is a quadratic equation.

In this example, for large values of n, say n = 100, the term (2n + 1) can be neglected compared to n²: at n = 100, n² becomes **10000** while 2n + 1 is only **201**. Likewise, when two different algorithms are compared, the constant coefficients of the higher-order terms (such as n²) are neglected.

Below are the three commonly used asymptotic notations for the time complexity of an algorithm.

- Omega Ω Notation
- Theta θ Notation
- Big Ο Notation

### Omega Notation, Ω(n)

**Omega notation Ω(n)** expresses the lower bound, or **best case**, time complexity of an algorithm. It is the minimum time the algorithm needs to solve a problem on a given input.

Put simply, expressing the time complexity of an algorithm in big-Ω form means that it will take at least that much time to execute.

For example, for a function f(n)

Ω(f(n)) = { g(n) : there exist c > 0 and n0 such that g(n) ≥ c·f(n) for all n > n0 }
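As a small worked example of this definition (with illustrative values of c and n0):

```latex
% f(n) = n,\; g(n) = n^2; choose c = 1 and n_0 = 1:
n^2 \ge 1 \cdot n \quad \text{for all } n > 1
\;\Longrightarrow\; n^2 \in \Omega(n)
```

That is, n² grows at least as fast as n, so n is a valid lower bound for n².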

### Big Oh Notation, Ο(n)

Big O notation, Ο(n), is used to express the upper bound, or **worst case**, time complexity of an algorithm. It is the maximum amount of time required to execute the program.

For example, for a function f(n)

Ο(f(n)) = { g(n) : there exist c > 0 and n0 such that g(n) ≤ c·f(n) for all n > n0 }
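As a small worked example of an upper bound (with illustrative values of c and n0):

```latex
% g(n) = 3n^2 + 2n + 1; choose c = 6 and n_0 = 1:
3n^2 + 2n + 1 \le 3n^2 + 2n^2 + n^2 = 6n^2 \quad \text{for all } n \ge 1
\;\Longrightarrow\; 3n^2 + 2n + 1 \in O(n^2)
```

This is exactly the simplification described earlier: the lower-order terms and constant coefficients are absorbed into c.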

### Theta - θ(n)

The **Theta notation θ(n)** is used to express both a lower bound and an upper bound on the time complexity of an algorithm, giving a tight bound on its growth. It is represented as

θ(f(n)) = { g(n) : g(n) = Ο(f(n)) and g(n) = Ω(f(n)) for all n > n0 }
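For instance, 3n² + 2n is bounded above and below by constant multiples of n², so it is Θ(n²) (constants chosen for illustration):

```latex
3n^2 \;\le\; 3n^2 + 2n \;\le\; 5n^2 \quad \text{for all } n \ge 1
\;\Longrightarrow\; 3n^2 + 2n \in \Theta(n^2)
```

Here c = 3 witnesses the Ω bound and c = 5 witnesses the Ο bound.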

#### An efficient algorithm must satisfy the following properties:

- The algorithm should stop after a finite number of instructions.
- Every step should be precisely stated and defined.
- Every step of the algorithm must generate some output.
- It should avoid redundancy, so that costly computations already performed are not repeated.
- It should take as little time as possible to execute.
- Where possible, it should prefer an efficient (e.g. heuristic) approach over brute force.