Publications

One can measure the importance of a scientific work by the number of earlier publications rendered superfluous by it.

David Hilbert


(Publications in reverse chronological order)

Published Papers & Technical Reports

  1. Majid Jahani, Mohammadreza Nazari, Rachael Tappenden, Albert S. Berahas and Martin Takáč. SONIA: A Symmetric Blockwise Truncated Optimization Algorithm. 24th International Conference on Artificial Intelligence and Statistics (AISTATS), 2021, pp. 487-495. (Download PDF, Supplementary Material)

  2. Albert S. Berahas, Raghu Bollapragada and Jorge Nocedal. An Investigation of Newton-Sketch and Subsampled Newton Methods. Optimization Methods and Software, 2020, 35(4), pp. 661-680. (Download PDF, OMS Online, Supplementary Material)

  3. Albert S. Berahas and Martin Takáč. A Robust Multi-Batch L-BFGS Method for Machine Learning. Optimization Methods and Software, 2020, 35(1), pp. 191-219. (Download PDF, OMS Online, Supplementary Material)

  4. Majid Jahani, Mohammadreza Nazari, Sergey Rusakov, Albert S. Berahas and Martin Takáč. Scaling Up Quasi-Newton Algorithms: Communication Efficient Distributed SR1. 6th Annual Conference on Machine Learning, Optimization and Data Science (LOD), 2020. (Download PDF)

  5. Zheng Shi, Nur Sila Gulgec, Albert S. Berahas, Shamim N. Pakzad and Martin Takáč. Finite Difference Neural Networks: Fast Prediction of Partial Differential Equations. 19th IEEE International Conference on Machine Learning and Applications (ICMLA), 2020, pp. 130-135. (Download PDF)

  6. Albert S. Berahas, Charikleia Iakovidou and Ermin Wei. Nested Distributed Gradient Methods with Adaptive Quantized Communication. 58th IEEE Conference on Decision and Control (CDC), Nice, France, 2019, pp. 1519-1525. (Download PDF, IEEE Xplore)

  7. Albert S. Berahas, Majid Jahani and Martin Takáč. Sampled Quasi-Newton Methods for Deep Learning. OPT 2019: Optimization for Machine Learning Workshop (NeurIPS 2019). (Download PDF)

  8. Albert S. Berahas, Richard Byrd and Jorge Nocedal. Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods. SIAM Journal on Optimization, 2019, 29(2), pp. 965-993. (Download PDF, SIOPT Online)

  9. Albert S. Berahas, Liyuan Cao, Krzysztof Choromanski and Katya Scheinberg. Linear Interpolation Gives Better Gradients than Gaussian Smoothing in Derivative-Free Optimization. Technical Report, Lehigh University, 2019. (Download PDF)

  10. Albert S. Berahas, Raghu Bollapragada, Nitish Shirish Keskar and Ermin Wei. Balancing Communication and Computation in Distributed Optimization. IEEE Transactions on Automatic Control, 2019, 64(8), pp. 3141-3155. (Download PDF, IEEE Xplore)

  11. Albert S. Berahas, Jorge Nocedal and Martin Takáč. A Multi-Batch L-BFGS Method for Machine Learning. Advances in Neural Information Processing Systems (NeurIPS), Barcelona, Spain, Dec 2016, pp. 1055-1063. (Download PDF, Supplementary Material)

  12. Nitish Shirish Keskar and Albert S. Berahas. adaQN: An Adaptive Quasi-Newton Algorithm for Training RNNs. European Conference on Machine Learning and Knowledge Discovery in Databases (ECML PKDD), Riva del Garda, Italy, Sept 2016, pp. 1-16. (Download PDF)

  13. Michael Iliadis, Leonidas Spinoulas, Albert S. Berahas, Haohong Wang and Aggelos K. Katsaggelos. Multi-Model Robust Error Correction for Face Recognition. IEEE International Conference on Image Processing (ICIP), Phoenix, Arizona, Sept 2016, pp. 3229-3233. (IEEE Xplore)

  14. Michael Iliadis, Leonidas Spinoulas, Albert S. Berahas, Haohong Wang and Aggelos K. Katsaggelos. Sparse Representation and Least Squares-Based Classification in Face Recognition. 22nd European Signal Processing Conference (EUSIPCO), Lisbon, Portugal, Sept 2014, pp. 526-530. (IEEE Xplore)

Working/Submitted Papers

  1. (Status: Accepted, Mathematical Programming) Albert S. Berahas, Frank E. Curtis and Baoyu Zhou. Limited-Memory BFGS with Displacement Aggregation. (Download PDF)

  2. (Status: Accepted, SIAM Journal on Optimization) Albert S. Berahas, Frank E. Curtis, Daniel P. Robinson and Baoyu Zhou. Sequential Quadratic Optimization for Nonlinear Equality Constrained Stochastic Optimization. (Download PDF)

  3. (Status: Accepted, SIAM Journal on Optimization) Albert S. Berahas, Liyuan Cao and Katya Scheinberg. Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise. (Download PDF)

  4. (Status: Accepted, Foundations of Computational Mathematics) Albert S. Berahas, Liyuan Cao, Krzysztof Choromanski and Katya Scheinberg. A Theoretical and Empirical Comparison of Gradient Approximations in Derivative-Free Optimization. (Download PDF)

  5. (Status: Submitted) Albert S. Berahas, Majid Jahani and Martin Takáč. Quasi-Newton Methods for Deep Learning: Forget the Past, Just Sample. (Download PDF)

  6. (Status: Submitted) Albert S. Berahas, Raghu Bollapragada and Ermin Wei. On the Convergence of Nested Distributed Gradient Methods with Multiple Consensus and Gradient Steps. (Download PDF)

  7. (Status: Coming soon!) Albert S. Berahas, Frank E. Curtis, Michael Mahoney, Fred Roosta and Martin Takáč. Variable-Metric Optimization Methods for Machine Learning.

  8. (Status: Coming soon!) Albert S. Berahas, Liyuan Cao and Katya Scheinberg. Convergence of Trust-Region Methods with Noise.

  9. (Status: Coming soon!) Albert S. Berahas, Oumaima Sohab and Luis Nunes Vicente. Full-Low Evaluation Methods for Derivative-Free Optimization.

  10. (Status: Coming soon!) Albert S. Berahas, Frank E. Curtis and Daniel P. Robinson. Sequential Quadratic Optimization for Nonlinear Equality Constrained Stochastic Optimization with Rank Deficient Jacobians.

  11. (Status: Coming soon!) Albert S. Berahas, Majid Jahani and Martin Takáč. Fast, Stochastic and Adaptive Sampled Quasi-Newton Methods.

  12. (Status: Coming soon!) Majid Jahani, Mohammadreza Nazari, Rachael Tappenden, Albert S. Berahas and Martin Takáč. Deterministic and Stochastic Hybrid Methods.

  13. (Status: Coming soon!) Majid Jahani, Mohammadreza Nazari, Rachael Tappenden, Albert S. Berahas and Martin Takáč. An Adaptive Stochastic Hybrid Method for Machine Learning.

Thesis

  1. Albert S. Berahas. Methods for Large Scale Nonlinear and Stochastic Optimization. PhD thesis, Northwestern University, March 2018. (ProQuest, Download PDF)