The differentiability assumption plays a vital role in nonlinear programming, because most methods for finding the optimum point in nonlinear programming start by computing the gradient of the function and then locating its stationary points. For unconstrained optimization problems, if the objective function is differentiable, one can check the positive definiteness of the Hessian matrix at the stationary points and thereby conclude whether or not those stationary points are optimum points. Similarly, if the objective function and the functions in the constraint set are differentiable, the well-known optimality conditions, the Karush-Kuhn-Tucker (KKT) conditions, lead to the optimum point(s) of the given optimization problem.

However, since the gradient of a non-differentiable function does not exist, we treat such problems by means of the subgradient, the directional derivative, or the Mordukhovich normal cone, depending on the convexity of the function. Consequently, the optimization procedure for problems in which the functions are not differentiable differs from the procedure for problems in which the objective function as well as the constraint functions are differentiable. This project focuses on finding optimality conditions for optimization problems without any differentiability assumptions. The subgradient and directional derivative approaches are used to solve nonsmooth optimization problems of convex type, and the Mordukhovich extremal principle is applied to solve nonsmooth optimization problems of nonconvex type.
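To illustrate the second-order test mentioned above, consider a small example (the particular function is chosen here purely for illustration and does not come from the original text). For the unconstrained problem

\min_{(x,y) \in \mathbb{R}^2} f(x,y) = x^2 + y^2 - 2x,

the gradient \nabla f(x,y) = (2x - 2, \, 2y) vanishes only at the stationary point (1,0). The Hessian

\nabla^2 f(x,y) = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix}

is positive definite there, so (1,0) is a strict local minimizer; since f is convex, it is in fact the global minimizer.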
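For the constrained differentiable case, the KKT conditions referred to above can be stated, in one standard form, for the problem

\min f(x) \quad \text{subject to} \quad g_i(x) \le 0, \quad i = 1, \dots, m,

as follows: a feasible point x^* together with multipliers \lambda_1, \dots, \lambda_m satisfies

\nabla f(x^*) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^*) = 0, \qquad \lambda_i \ge 0, \qquad \lambda_i \, g_i(x^*) = 0, \quad i = 1, \dots, m.

Under a suitable constraint qualification these conditions are necessary for optimality, and when f and the g_i are convex they are also sufficient.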
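When f is convex but not differentiable, the gradient in the above conditions is replaced by the subdifferential. A vector v is a subgradient of f at \bar{x} if

f(x) \ge f(\bar{x}) + \langle v, \, x - \bar{x} \rangle \quad \text{for all } x,

and the set \partial f(\bar{x}) of all such v is the subdifferential. The convex Fermat rule then states that \bar{x} minimizes f if and only if 0 \in \partial f(\bar{x}). For example, f(x) = |x| is not differentiable at 0, yet \partial f(0) = [-1, 1] contains 0, which certifies x = 0 as the global minimizer.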
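For the nonconvex case treated by the Mordukhovich extremal principle, the corresponding stationarity condition can be sketched as follows (stated here from the standard variational-analysis literature, since the original text does not spell it out): if \bar{x} is a local minimizer of a lower semicontinuous function f, then

0 \in \partial f(\bar{x}),

where \partial f(\bar{x}) now denotes the limiting (Mordukhovich) subdifferential. For minimization over a constraint set \Omega, the Mordukhovich normal cone N(\bar{x}; \Omega) enters analogously, giving, under suitable qualification conditions, the condition 0 \in \partial f(\bar{x}) + N(\bar{x}; \Omega).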