# C# .NET - Looping structure - Asked By Neha . on 16-Jul-11 01:06 PM

Can anyone please show step by step way to find time complexities for the following loops.

a.

```
for (cnt1 = 0, i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)
        cnt1++;
```

b.

```
for (i = 0; i < n; i++)
    for (j = 0; j < n; j++)
        for (k = a[i][j] = 0; k < n; k++)
            a[i][j] = b[i][k] * c[k][j];
```

Devil Scorpio replied to Neha . on 16-Jul-11 01:54 PM
Hi Neha,

Calculating the time complexity of a loop can be tricky. I hope the worked example below helps you understand how to calculate it.

To find `T(n)`, I looked at the values of `j` and at how many times the loop executed:

```
values of j:   2, 3, 4, ..., n
loop executes: 1, 2, 3, ..., n
```

Analysis:
Take the sum of the loop executions, 1 + 2 + 3 + ... + n, and recognize that it equals n(n+1)/2. Assign this as T(n) and prove that it is tightly bounded by n^2, that is, n(n+1)/2 = Θ(n^2).

Scratch Work:

```
Find C1, C2, n0 ∈ R+ such that 0 ≤ C1(n^2) ≤ n(n+1)/2 ≤ C2(n^2) for all n ≥ n0

To make 0 ≤ C1(n^2) true, C1 and n0 can be any positive reals
To make C1(n^2) ≤ n(n+1)/2 hold, C1 must be ≤ 1/2
To make n(n+1)/2 ≤ C2(n^2) hold (with n0 = 1), C2 must be ≥ 1
```

Proof:

```
Find C1, C2, n0 ∈ R+ such that 0 ≤ C1(n^2) ≤ n(n+1)/2 ≤ C2(n^2) for all n ≥ n0
Let C1 = 1/2, C2 = 1 and n0 = 1.
```
1. Show that 0 ≤ C1(n^2) is true:
C1(n^2) = n^2/2
n^2/2 ≥ n0^2/2 = 1/2 > 0
Therefore C1(n^2) ≥ 0 is proven true!

2. Show that C1(n^2) ≤ n(n+1)/2 is true:
n^2/2 ≤ n(n+1)/2
n^2 ≤ n(n+1)
n^2 ≤ n^2 + n
0 ≤ n

This we know is true since n ≥ n0 = 1.
Therefore C1(n^2) ≤ n(n+1)/2 is proven true!

3. Show that n(n+1)/2 ≤ C2(n^2) is true:
n(n+1)/2 ≤ C2(n^2)
(n+1)/2 ≤ C2 · n
n+1 ≤ 2 · C2 · n
n+1 ≤ 2n
1 ≤ n

Also, we know this to be true since n ≥ n0 = 1.

Hence by 1, 2 and 3, n(n+1)/2 = Θ(n^2) is true, since
0 ≤ C1(n^2) ≤ n(n+1)/2 ≤ C2(n^2) for all n ≥ n0.

Reference :- http://stackoverflow.com/questions/656745/how-do-i-find-the-time-complexity-tn-and-show-that-it-is-tightly-bounded-big-t
Radhika roy replied to Neha . on 17-Jul-11 02:15 AM
The time complexity is a measure of how the execution time varies with the amount of data to be processed. In your example the amount of data is indicated by n, and you have two nested loops.

In its first pass, the inner loop runs n-1 times (j going from 0 up to, but not including, n-1). The second time through, it runs n-2 times, then n-3 times, and so on until, on the last pass, when i is n-2, the inner loop runs only once. So the instructions inside the inner loop are executed (n-1) + (n-2) + (n-3) + ... + 1 times. (Note this counting applies when the inner loop's bound depends on i; in the loops you posted the inner bound is fixed at n, so the inner body runs exactly n times on every pass.) This is close enough to the sum of all numbers from 1 to n that we can use the formula for that sum, which is (n^2 + n)/2.

When finding the time complexity, we focus on the highest-order term only and ignore all constants. In this case we keep only the n^2 term, so the time complexity of this algorithm is O(n^2). This means that the time to process the data grows as the square of the amount of data. Notice that we are not looking for an exact number, only a general trend, which is why it is fine to approximate the sum from 1 to n-1 using the formula for the sum from 1 to n.

http://en.allexperts.com/q/C-1587/2011/5/Find-time-complexity-given.htm

Reena Jain replied to Neha . on 17-Jul-11 01:01 PM
Hi,

One quick way to get a Big-O bound is to look at nested for loops.

Typically (but not always) one loop nested in another will cause O(n^2).

Think about it: the inner loop is executed i times for each value of i, and the outer loop is executed n times.

Thus you see a pattern of executions like this: 1 + 2 + 3 + 4 + ... + n.

Therefore, we can bound the number of code executions by saying it obviously executes more than n times (lower bound), but in terms of n how many times are we executing the code?

Well, mathematically we can say that it will execute no more than n^2 times, giving us a worst-case scenario and therefore our Big-O bound of O(n^2). (For more information on how we can say this mathematically, look at http://math2.org/math/expansion/power.htm)

Big-O doesn't always measure exactly how much work is being done, but it usually gives a reliable approximation of the worst-case scenario.