Before moving forward, here are a few facts about how complexity works:
 There are different ways of representing the complexity of an algorithm; Big O is one of several asymptotic notations, alongside Big Omega and Big Theta.
 When calculating the complexity of an algorithm, basic statements are assumed to take a constant number of CPU cycles and are therefore ignored. For example, a = b + c; is not counted towards the complexity, since on any CPU it is treated as taking 1 cycle to complete.
 Most important, it is mainly the loops (and recursive calls) that are considered while actually deriving the order of complexity.
Now that we know the basics, let us jump into the pool of complexity. :D
1. Let us start with a very basic example of adding 2 numbers.
public int fun(final int a, final int b){
    return a + b; // A
}
In this program statement A is considered to take 1 CPU cycle, and that is why the order of complexity of this program is O(1).
2. Now let us take another example of printing n numbers given by user.
public void fun(int n){
    for(int i = 0; i <= n; i++){
        System.out.println((i + 1)); // B
    }
}
Now in this case there is a loop, and this is where the real work of calculating the order begins. Statement B runs n times, where n is the number passed to the function. So the order would be O(n).
3. Now let us take the same loop, but with the increment changed a little.
public void fun(int n){
    for(int i = 0; i <= n; i += 2){
        System.out.println((i + 1)); // C
    }
}
In this case statement C is going to run about n/2 times, since the +2 increment cuts the total iterations in half, so the order of the algorithm would be O(n/2). (Since constant factors are dropped, this is the same as O(n).)
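If you want to verify the n/2 count empirically, here is a minimal sketch that counts the iterations (the class and method names are mine, not part of the examples above):

```java
public class StepTwoLoopCount {
    // Counts how many times the body of the += 2 loop executes.
    static int countStepTwo(int n) {
        int count = 0;
        for (int i = 0; i <= n; i += 2) {
            count++; // this is where statement C would run
        }
        return count;
    }

    public static void main(String[] args) {
        // With the <= bound the exact count is n/2 + 1, i.e. about n/2.
        System.out.println(countStepTwo(10));  // 6  (i = 0,2,4,6,8,10)
        System.out.println(countStepTwo(100)); // 51
    }
}
```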
4. Now let us change the condition as well.
public void fun(int n){
    for(int i = 0; i <= n * n; i += 2){ // n * n, since in Java ^ is XOR, not power
        System.out.println((i + 1)); // D
    }
}
In this case statement D is going to run until the counter reaches n^2, with the +2 increment again cutting the total iterations in half, so it executes about n^2/2 times and the order of the algorithm would be O(n^2/2), which simplifies to O(n^2).
I hope this is making clear how the complexity is calculated. It will become clearer as you see more examples going forward.
5. So coming back to the complexity, let us now take an example of 2 loops.
public void fun(int n){
    for(int i = 0; i <= n; i++){
        for(int j = 0; j <= n; j++){
            // E
        }
    }
}
Statement E is the part where your logic goes. In this case the outer loop on i runs n times, and for each iteration of i the inner loop runs n times. If I draw a chart of the number of iterations it would be:
i      E executed
1      n
2      n
...    ...
n      n
Now the total number of times statement E runs is n + n + ... + n (n times), which is n*n, and so the order would be O(n^2).
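One way to convince yourself of the n*n count is to count the executions of E directly. A minimal sketch (the class and method names are mine):

```java
public class NestedLoopCount {
    // Counts how many times the body E of the nested loops executes.
    static long countNested(int n) {
        long count = 0;
        for (int i = 0; i <= n; i++) {
            for (int j = 0; j <= n; j++) {
                count++; // this is where E would run
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // With the <= bounds each loop runs n+1 times (0..n inclusive),
        // so the body runs (n+1)^2 times, which grows as n^2.
        System.out.println(countNested(10));  // 121
        System.out.println(countNested(100)); // 10201
    }
}
```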
6. Now taking an example of 2 loops where one loop is dependent on the other.
public void fun(int n){
    for(int i = 0; i <= n; i++){
        for(int j = 0; j <= i; j++){
            // F
        }
    }
}
In this case, if you look closely, the inner loop depends on the outer loop. Here is a table showing how many times the inner body runs:
i      F executed
1      1
2      2
...    ...
n      n
This means statement F, in other words the inner loop body, executes 1 + 2 + 3 + ... + n times. This is an arithmetic progression, the sum of the first n natural numbers, whose formula is n(n+1)/2, which on expansion gives (n^2 + n)/2. Keeping only the highest-degree term, the order of this program is O(n^2).
You might ask why we considered only the n^2 term and not the n in (n^2 + n)/2. The reason is that for large values of n the n term does not make much difference, and when the order of an algorithm is calculated, only the highest-degree term of the expression is taken into consideration.
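The triangular count can also be checked empirically. A minimal sketch (names are mine); note that with the <= bounds in the code above the exact count is (n+1)(n+2)/2 rather than n(n+1)/2, but the order is the same:

```java
public class DependentLoopCount {
    // Counts executions of the inner body F when the inner bound depends on i.
    static long countDependent(int n) {
        long count = 0;
        for (int i = 0; i <= n; i++) {
            for (int j = 0; j <= i; j++) {
                count++; // this is where F would run
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // 1 + 2 + ... + (n+1) = (n+1)(n+2)/2, still of order n^2.
        System.out.println(countDependent(10));  // 66
        System.out.println(countDependent(100)); // 5151
    }
}
```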
7. Taking an example of a loop where the variable increments drastically (in multiples).
public void fun(int n){
    for(int i = 1; i <= n; i = i * 2){ // i must start at 1; starting at 0 would loop forever, since 0*2 is still 0
        // G
    }
}
This part is a bit tricky, so please give it some attention. Here is the idea of how the complexity of these types is calculated. On every iteration i doubles, so after k iterations i has grown to 2^k, and the loop keeps running while i <= n holds, i.e. until 2^k exceeds n, where k is the number of iterations so far. Shown mathematically:
Mathematical explanation for the order:
n = 2^k
Taking log base 2 on both sides:
log_2 n = k
So the number of iterations k equals log_2 n, and the order of the above code is O(log_2 n). In one way or another this says that if the increment is done in multiples, the order is going to be in terms of log, which means the algorithm will be pretty efficient.
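To see the logarithmic growth concretely, here is a minimal iteration counter (names are mine). Note again that the loop variable must start at 1, not 0:

```java
public class DoublingLoopCount {
    // Counts iterations of a loop whose variable doubles each pass.
    static int countDoubling(int n) {
        int count = 0;
        for (int i = 1; i <= n; i *= 2) {
            count++; // this is where G would run
        }
        return count;
    }

    public static void main(String[] args) {
        // The count is floor(log2(n)) + 1, i.e. logarithmic growth.
        System.out.println(countDoubling(16));   // 5  (i = 1,2,4,8,16)
        System.out.println(countDoubling(1024)); // 11
    }
}
```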
8. Taking an example of a loop where the variable increments in multiples and the condition runs till a higher power.
public void fun(int n){
    for(int i = 1; i <= n * n; i = i * 2){ // again n * n for n squared, and i starts at 1
        // H
    }
}
This case is pretty interesting to understand. The loop structure is the same as before, except that the condition runs till n^2. Setting n^2 = 2^k and taking log base 2 gives k = log_2(n^2) = 2 log_2 n, so the loop runs roughly twice as many times as in point 7. The order is therefore O(2 log_2 n), which is still O(log n) once the constant factor is dropped.
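A quick empirical check of the iteration count for this case (names are mine):

```java
public class DoublingToSquareCount {
    // Counts iterations of a doubling loop that runs while i <= n*n.
    static int countDoublingToSquare(int n) {
        int count = 0;
        for (int i = 1; i <= n * n; i *= 2) {
            count++; // this is where H would run
        }
        return count;
    }

    public static void main(String[] args) {
        // log2(n^2) = 2*log2(n): roughly twice the plain doubling loop,
        // still logarithmic.
        System.out.println(countDoublingToSquare(16));   // 9  (1..256 in powers of 2)
        System.out.println(countDoublingToSquare(1024)); // 21
    }
}
```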
9. Taking an example of a loop where the variable shrinks by taking square roots.
public void fun(int n, int k){ // k is any constant
    while(n > k){
        n = (int) Math.sqrt(n); // in simple words, the square root of n
    }
}
In this case n shrinks to its square root on every iteration. After the 1st iteration n becomes n^{1/2}, after the 2nd it becomes n^{1/4}, after the 3rd n^{1/8}, and so on: after x iterations it is n^{1/2^x}. The loop goes on until n^{1/2^x} reaches k, as per the condition. Shown mathematically:
Mathematical explanation for the order:
n^{1/2^x} = k
Taking log base k on both sides:
(1/2^x) * log_k n = 1
log_k n = 2^x
Taking log base 2 on both sides:
log_2 log_k n = x
This gives the number of iterations x, and hence the order, as O(log_2 log_k n). This part is a bit tricky, so please give it some attention.
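The log log behaviour can be observed directly by counting the square-root steps. A minimal sketch (names are mine), using double so the square roots of powers of 2 stay exact:

```java
public class SqrtLoopCount {
    // Counts iterations of a loop that replaces n with sqrt(n) each pass,
    // stopping once n is no longer greater than the constant k.
    static int countSqrt(double n, double k) {
        int count = 0;
        while (n > k) {
            n = Math.sqrt(n);
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        // n = 2^16, k = 2: 65536 -> 256 -> 16 -> 4 -> 2, i.e. 4 steps,
        // matching log2(log2(65536)) = log2(16) = 4.
        System.out.println(countSqrt(65536, 2));        // 4
        System.out.println(countSqrt(4294967296.0, 2)); // 5  (n = 2^32)
    }
}
```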
Now, coming to the end, I am going to summarize things in a simple manner so you can easily understand how the order of an algorithm is calculated.
A very basic rule: when the loop variable increments, the condition is (most of the time) a less-than comparison, and when the value decrements, the condition is a greater-than comparison. If you look at our examples, it is evident that things move this way.
So I am going to sum up the whole discussion in a table with three columns: the increment pattern, the decrement pattern, and the order it produces. Here it goes.
Summary of the order
Increment    Decrement    Order would be
(n + x)      (n - x)      O(n/x)
(n * x)      (n / x)      O(log_x n)
(n^x)        (n^{1/x})    O(log_x log n)
In these orders, the end condition of the loop takes the place of n. So if a multiplicative loop runs till n^2, the order would be O(log_x n^2) = O(2 log_x n), which is still logarithmic. This case can be understood from point number 8.
Note: In most cases we do not mention the base of the log, since changing the base only multiplies the result by a constant and so does not matter for large values of n. So the order is simply written as O(log log n).