Mathematical optimization is the process of maximizing or minimizing an objective function by finding the best available values of the inputs within a restricted domain. This project focuses on finding optimality conditions for optimization problems involving differentiable functions. For unconstrained optimization problems with a differentiable objective function, checking the positive definiteness of the Hessian matrix at the stationary points determines whether those stationary points are optimum points. For constrained optimization problems in which the objective function and the functions in the constraint set are differentiable, the well-known optimality conditions called the Karush-Kuhn-Tucker (KKT) conditions lead to the optimum point(s) of the given optimization problem. However, the conventional Lagrangian approach to solving constrained optimization problems yields optimality conditions that are either necessary or sufficient, but not both, unless the objective function and the functions in the constraint set are also convex. The Tchebyshev norm leads to an optimality condition that is both necessary and sufficient without any convexity assumption. This optimality condition can be used to devise a conceptually simple method for solving non-convex inequality-constrained optimization problems.
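For reference, the KKT conditions discussed above can be stated in their standard form for an inequality-constrained problem $\min f(x)$ subject to $g_i(x) \le 0$, $i = 1, \dots, m$ (a textbook formulation, not a result specific to this project):

```latex
\nabla f(x^*) + \sum_{i=1}^{m} \mu_i \nabla g_i(x^*) = 0, \qquad
\mu_i \ge 0, \quad g_i(x^*) \le 0, \quad \mu_i \, g_i(x^*) = 0, \quad i = 1, \dots, m.
```

The last (complementary slackness) condition forces the multiplier $\mu_i$ to vanish whenever the constraint $g_i$ is inactive at $x^*$. Without convexity of $f$ and the $g_i$, a point satisfying these conditions need not be a global optimum, which is the gap the Tchebyshev-norm condition is meant to close.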
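The second-order test for unconstrained problems mentioned above can be sketched as follows. This is a minimal illustration, not a method from this project: the objective f(x, y) = x⁴ − 2x² + y² and its stationary points are a hypothetical example, and the Hessian is supplied analytically for simplicity.

```python
import numpy as np

def hessian(x):
    # Analytic Hessian of the example objective f(x, y) = x^4 - 2x^2 + y^2
    return np.array([[12.0 * x[0]**2 - 4.0, 0.0],
                     [0.0,                  2.0]])

def classify_stationary_point(x, tol=1e-8):
    """Classify a stationary point via the eigenvalues of the Hessian."""
    eigs = np.linalg.eigvalsh(hessian(x))
    if np.all(eigs > tol):
        return "local minimum"    # Hessian positive definite
    if np.all(eigs < -tol):
        return "local maximum"    # Hessian negative definite
    if np.any(eigs > tol) and np.any(eigs < -tol):
        return "saddle point"     # indefinite Hessian
    return "inconclusive"         # semidefinite: second-order test fails

# Stationary points of f are (0, 0), (1, 0), and (-1, 0).
print(classify_stationary_point(np.array([0.0, 0.0])))  # saddle point
print(classify_stationary_point(np.array([1.0, 0.0])))  # local minimum
```

Because the Hessian is symmetric, `eigvalsh` is the appropriate routine; positive definiteness is equivalent to all eigenvalues being strictly positive.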