Publications

One can measure the importance of a scientific work by the number of earlier publications rendered superfluous by it.

David Hilbert



Published Papers

Journal Publications
  1. Albert S. Berahas, Frank E. Curtis, Michael J. O’Neill and Daniel P. Robinson. A Stochastic Sequential Quadratic Optimization Algorithm for Nonlinear Equality Constrained Optimization with Rank-Deficient Jacobians. Mathematics of Operations Research, 2023, DOI: 10.1287/moor.2021.0154. (Download PDF, MOR Online)
  2. Liyuan Cao, Albert S. Berahas and Katya Scheinberg. First- and Second-Order High Probability Complexity Bounds for Trust-Region Methods with Noisy Oracles. Mathematical Programming, 2023, DOI: 10.1007/s10107-023-01999-5. (Download PDF, MAPR Online)
  3. Albert S. Berahas, Jiahao Shi, Zihong Yi and Baoyu Zhou. Accelerating Stochastic Sequential Quadratic Programming for Equality Constrained Optimization using Predictive Variance Reduction. Computational Optimization and Applications, 2023, DOI: 10.1007/s10589-023-00483-2. (Download PDF, COAP Online)
  4. Cheoljoon Jeong, Ziang Xu, Albert S. Berahas, Eunshin Byon, and Kristen Cetin. Multiblock parameter calibration in computer models. INFORMS Journal on Data Science, 2023, DOI: 10.1287/ijds.2023.0029. (Download PDF, IJDS Online, Supplementary Material)
  5. Albert S. Berahas, Oumaima Sohab, and Luis Nunes Vicente. Full-low evaluation methods for derivative-free optimization. Optimization Methods and Software, 2022, 38(2), pp. 386-411. (Download PDF, OMS Online)
  6. Albert S. Berahas, Liyuan Cao, Krzysztof Choromanski and Katya Scheinberg. A Theoretical and Empirical Comparison of Gradient Approximations in Derivative-Free Optimization. Foundations of Computational Mathematics, 2022, 22(2), pp. 507-560. (Download PDF, FoCM Online)
  7. Albert S. Berahas, Frank E. Curtis and Baoyu Zhou. Limited-Memory BFGS with Displacement Aggregation. Mathematical Programming, 2022, 194(1), pp. 121-157. (Download PDF, MAPR Online)
  8. Rinav Pillai, Vassilis Triantopoulos, Albert S. Berahas, Matthew Brusstar, Ruonan Sun, Tim Nevius and André L. Boehman. Modeling and Predicting Heavy-Duty Vehicle Engine-Out and Tailpipe Nitrogen Oxide (NOx) Emissions Using Deep Learning. Frontiers in Mechanical Engineering, 2022, DOI: 10.3389/fmech.2022.840310. (Download PDF, Frontiers Online)
  9. Albert S. Berahas, Frank E. Curtis, Daniel P. Robinson and Baoyu Zhou. Sequential Quadratic Optimization for Nonlinear Equality Constrained Stochastic Optimization. SIAM Journal on Optimization, 2021, 31(2), pp. 1352-1379. (Download PDF, SIOPT Online)
  10. Albert S. Berahas, Liyuan Cao and Katya Scheinberg. Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise. SIAM Journal on Optimization, 2021, 31(2), pp. 1489-1518. (Download PDF, SIOPT Online)
  11. Albert S. Berahas, Majid Jahani, Peter Richtárik and Martin Takáč. Quasi-Newton Methods for Machine Learning: Forget the Past, Just Sample. Optimization Methods and Software, 2022, 37(5), pp. 1668-1704. (Download PDF, OMS Online, Supplementary Material)
  12. Albert S. Berahas, Raghu Bollapragada and Ermin Wei. On the Convergence of Nested Distributed Gradient Methods with Multiple Consensus and Gradient Steps. IEEE Transactions on Signal Processing, 2021, 69, pp. 4192-4203. (Download PDF, IEEE Xplore)
  13. Albert S. Berahas, Raghu Bollapragada and Jorge Nocedal. An Investigation of Newton-Sketch and Subsampled Newton Methods. Optimization Methods and Software, 2020, 35(4), pp. 661-680. (Download PDF, OMS Online, Supplementary Material)
  14. Albert S. Berahas and Martin Takáč. A Robust Multi-Batch L-BFGS Method for Machine Learning. Optimization Methods and Software, 2020, 35(1), pp. 191-219. (Download PDF, OMS Online, Supplementary Material)
  15. Albert S. Berahas, Richard Byrd and Jorge Nocedal. Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods. SIAM Journal on Optimization, 2019, 29(2), pp. 965-993. (Download PDF, SIOPT Online)
  16. Albert S. Berahas, Raghu Bollapragada, Nitish Shirish Keskar and Ermin Wei. Balancing Communication and Computation in Distributed Optimization. IEEE Transactions on Automatic Control, 2019, 64(8), pp. 3141-3155. (Download PDF, IEEE Xplore)
Conference Publications
  1. Majid Jahani, Mohammadreza Nazari, Rachael Tappenden, Albert S. Berahas and Martin Takáč. SONIA: A Symmetric Blockwise Truncated Optimization Algorithm. 24th International Conference on Artificial Intelligence and Statistics (AISTATS), 2021, pp. 487-495. (Download PDF, Supplementary Material)
  2. Majid Jahani, Mohammadreza Nazari, Sergey Rusakov, Albert S. Berahas and Martin Takáč. Scaling Up Quasi-Newton Algorithms: Communication Efficient Distributed SR1. 6th Annual Conference on Machine Learning, Optimization and Data Science (LOD), 2020, pp. 41-54. (Download PDF, LOD 2021)
  3. Sudeep Metha, Ved Patel and Albert S. Berahas. Auction-Based Preferential Shift Scheduling: A Case Study on the Lehigh University Libraries. Institute of Industrial and Systems Engineers (IISE) Conference and Expo, 2021. (Download PDF)
  4. Zheng Shi, Nur Sila Gulgec, Albert S. Berahas, Shamim N. Pakzad and Martin Takáč. Finite Difference Neural Networks: Fast Prediction of Partial Differential Equations. 2020 19th IEEE International Conference on Machine Learning and Applications (ICMLA), 2020, pp. 130-135. (Download PDF, IEEE Xplore)
  5. Albert S. Berahas, Charikleia Iakovidou and Ermin Wei. Nested Distributed Gradient Methods with Adaptive Quantized Communication. 58th IEEE Conference on Decision and Control (CDC), Nice, France, 2019, pp. 1519-1525. (Download PDF, IEEE Xplore)
  6. Albert S. Berahas, Jorge Nocedal and Martin Takáč. A Multi-Batch L-BFGS Method for Machine Learning. 2016 Advances In Neural Information Processing Systems (NeurIPS), Barcelona, Spain, Dec 2016, pp. 1055-1063. (Download PDF, Supplementary Material)
  7. Nitish Shirish Keskar and Albert S. Berahas. adaQN: An adaptive quasi-Newton algorithm for training RNNs. 2016 European Conference on Machine Learning and Knowledge Discovery in Databases (ECML PKDD), Riva del Garda, Italy, Sept 2016, pp. 1-16. (Download PDF, ECML 2016)
  8. Michael Iliadis, Leonidas Spinoulas, Albert S. Berahas, Haohong Wang and Aggelos K. Katsaggelos. Multi-model robust error correction for face recognition. 2016 IEEE International Conference on Image Processing (ICIP), Phoenix, Arizona, Sept 2016, pp. 3229-3233. (IEEE Xplore)
  9. Michael Iliadis, Leonidas Spinoulas, Albert S. Berahas, Haohong Wang and Aggelos K. Katsaggelos. Sparse representation and least squares-based classification in face recognition. 2014 IEEE 22nd European Signal Processing Conference (EUSIPCO), Lisbon, Portugal, Sept 2014, pp. 526-530. (IEEE Xplore)
Workshop Papers & Technical Reports
  1. Albert S. Berahas, Majid Jahani and Martin Takáč. Sampled Quasi-Newton Methods for Deep Learning. OPT 2019: Optimization for Machine Learning Workshop (NeurIPS 2019). (Download PDF)
  2. Albert S. Berahas, Liyuan Cao, Krzysztof Choromanski and Katya Scheinberg. Linear interpolation gives better gradients than Gaussian smoothing in derivative-free optimization. Technical Report, Lehigh University, 2019. (Download PDF)

Working/Submitted Papers

  1. (Status: Submitted) Albert S. Berahas, Raghu Bollapragada and Baoyu Zhou. An Adaptive Sampling Sequential Quadratic Programming Method for Equality Constrained Stochastic Optimization. arXiv preprint arXiv:2206.00712, 2022. (Download PDF)
  2. (Status: Submitted) Vivak Patel and Albert S. Berahas. Gradient Descent in the Absence of Global Lipschitz Continuity of the Gradients: Convergence, Divergence and Limitations of its Continuous Approximation. arXiv preprint arXiv:2210.02418, 2022. (Download PDF)
  3. (Status: Submitted) Albert S. Berahas, Miaolan Xie and Baoyu Zhou. A Sequential Quadratic Programming Method with High Probability Complexity Bounds for Nonlinear Equality Constrained Stochastic Optimization. arXiv preprint arXiv:2301.00477, 2023. (Download PDF)
  4. (Status: Submitted) Albert S. Berahas, Raghu Bollapragada and Shagun Gupta. Balancing Communication and Computation in Gradient Tracking Algorithms for Decentralized Optimization. arXiv preprint arXiv:2303.14289, 2023. (Download PDF)
  5. (Status: Submitted) Xubo Yue, Raed Al Kontar, Albert S. Berahas, Yang Liu, Zhenghao Zai, Kevin Edgar, and Blake N. Johnson. Collaborative and Distributed Bayesian Optimization via Consensus: Showcasing the Power of Collaboration for Optimal Design. arXiv preprint arXiv:2306.14348, 2023. (Download PDF)
  6. (Status: Submitted) Suhail M. Shah, Albert S. Berahas and Raghu Bollapragada. Adaptive Consensus: A network pruning approach for decentralized optimization. arXiv preprint arXiv:2309.02626, 2023. (Download PDF)
  7. (Status: Submitted) Albert S. Berahas, Lindon Roberts and Fred Roosta. Non-uniform smoothness for gradient descent. arXiv preprint arXiv:2311.08615, 2023. (Download PDF)

Thesis

  1. Albert S. Berahas. Methods for Large Scale Nonlinear and Stochastic Optimization. PhD Thesis, Northwestern University, March 2018. (ProQuest, Download PDF)