Developing deep neural networks effectively requires careful selection of model hyperparameters. The process of choosing an optimal set of hyperparameters for a learning algorithm is called hyperparameter tuning; common methods include Grid Search and Random Search. We have developed a novel approach to hyperparameter tuning (Aisara-opti-search) that achieves a low loss value/high accuracy with fewer trials.