The researcher has made significant contributions to efficient optimization algorithms for machine learning, with a particular focus on variance reduction techniques for stochastic gradient descent. Their work includes novel variants of methods such as SVRG (Stochastic Variance Reduced Gradient) and SAGA (Stochastic Average Gradient Algorithm), aimed at improving the scalability and performance of these methods in deep learning applications. The research also extends to constrained optimization problems, demonstrating applicability in real-world settings where algorithmic efficiency is crucial for training on large-scale datasets.
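To illustrate the variance reduction idea referenced above, the sketch below shows the standard two-loop SVRG scheme: an outer loop computes a full gradient at a snapshot point, and an inner loop uses it to de-bias individual stochastic gradients. This is a minimal NumPy sketch of the general technique, not the researcher's own implementation; the names and parameters (grad_i, lr, epochs, inner_steps) are illustrative assumptions.

```python
import numpy as np

def svrg(grad_i, w0, n, lr=0.1, epochs=10, inner_steps=None):
    """Minimal SVRG sketch.

    grad_i(w, i) returns the gradient of the i-th component function at w,
    n is the number of components, and w0 is the starting point.
    All names here are illustrative, not taken from the researcher's code.
    """
    inner_steps = inner_steps or n
    w = w0.copy()
    for _ in range(epochs):
        # Snapshot point and full gradient: the variance-reduction anchor.
        w_snap = w.copy()
        full_grad = np.mean([grad_i(w_snap, i) for i in range(n)], axis=0)
        for _ in range(inner_steps):
            i = np.random.randint(n)
            # Variance-reduced stochastic gradient estimate: unbiased, and its
            # variance shrinks as w approaches the snapshot point.
            g = grad_i(w, i) - grad_i(w_snap, i) + full_grad
            w -= lr * g
    return w
```

Because the correction term vanishes as the iterate approaches the snapshot, this estimator admits constant step sizes and, for strongly convex finite-sum objectives, linear convergence, which is the scalability benefit variance reduction methods such as SVRG and SAGA provide over plain stochastic gradient descent.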
All Papers
No papers found for the selected criteria.
No collaborations found in the dataset.
This profile is generated from publicly available publication metadata and is intended for research discovery purposes. Themes, summaries, and trajectories are inferred computationally and may not capture the full scope of the lecturer's work. For authoritative information, please refer to the official KNUST profile.