TechRxiv

Computational Complexity of Gradient Descent Algorithm

Preprint posted on 2021-05-08, 15:33, authored by Nishchal J and Neel Bhandari
Information is growing exponentially, and the world increasingly mines knowledge from Big Data. Machine Learning uses labelled data for automated learning and data analysis, and Linear Regression is a statistical method for predictive analysis. Gradient Descent is an iterative optimisation procedure that minimises a cost function, here the mean squared error, by repeatedly updating the model parameters along the negative gradient. This work presents an insight into three variants of the Gradient Descent algorithm, namely Batch Gradient Descent, Stochastic Gradient Descent, and Mini-Batch Gradient Descent, implemented on a linear regression dataset, and determines their computational complexity along with factors such as learning rate, batch size, and number of iterations that affect the efficiency of the algorithm.
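
For illustration only (this sketch is not taken from the paper; the function name, dataset, and hyperparameters are assumptions), the Python snippet below shows linear regression trained by minimising the mean squared error with gradient descent, where the batch size selects between the three variants the abstract compares: the full dataset (Batch), a single sample (Stochastic), or a small subset (Mini-Batch) per parameter update.

import numpy as np

def gradient_descent(X, y, lr=0.01, epochs=100, batch_size=None, seed=0):
    # Minimise mean squared error for linear regression y ~ X @ w + b.
    # batch_size=None -> Batch Gradient Descent (whole dataset per update)
    # batch_size=1    -> Stochastic Gradient Descent (one sample per update)
    # otherwise       -> Mini-Batch Gradient Descent
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    bs = n if batch_size is None else batch_size
    for _ in range(epochs):
        idx = rng.permutation(n)                      # shuffle once per epoch
        for start in range(0, n, bs):
            batch = idx[start:start + bs]
            Xb, yb = X[batch], y[batch]
            err = Xb @ w + b - yb                     # prediction error on the batch
            grad_w = 2.0 * Xb.T @ err / len(batch)    # d(MSE)/dw
            grad_b = 2.0 * err.mean()                 # d(MSE)/db
            w -= lr * grad_w                          # step against the gradient
            b -= lr * grad_b
    return w, b

# Usage: recover a known linear relation from noisy synthetic data.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
y = X @ np.array([3.0, -1.5]) + 0.7 + 0.1 * rng.normal(size=500)
print(gradient_descent(X, y, lr=0.05, epochs=200, batch_size=32))

Per epoch, all three variants touch every sample once, so the cost per epoch is similar; what differs is the number of parameter updates (and hence gradient noise and convergence behaviour), which is what the learning rate, batch size, and iteration count trade off against each other.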

History

Email Address of Submitting Author

nishchalj.cs18@rvce.edu.in

Submitting Author's Institution

RV College of Engineering

Submitting Author's Country

India
