Date of Award
1-2026
Document Type
Thesis
Degree Name
Master of Science (MS)
College/School
School of Computing
Department/Program
School of Computing
Thesis Sponsor/Dissertation Chair/Project Chair
Bharath K. Samanthula
Committee Member
Boxiang Dong
Committee Member
Jiacheng Shang
Abstract
Logistic regression has found extensive use as a supervised machine learning algorithm due to its simplicity and efficiency in binary and multiclass classification tasks. As data sharing grows across connected devices, safeguarding sensitive personal and industrial information is increasingly important. Privacy-preserving machine learning (PPML) techniques such as differential privacy and homomorphic encryption offer mathematically rigorous security guarantees, but they introduce difficult trade-offs among accuracy, privacy loss, and computational overhead. This thesis investigates PPML for logistic regression through a collaborative mini-batch training framework. I propose and implement an ordered mini-batch strategy, compare it to standard shuffled methods, and then integrate differential privacy noise injection and homomorphic encryption-style encrypted inference. Experiments on two real-world datasets demonstrate that the ordered batch method can match or exceed unordered training in both non-private and privacy-preserving settings while maintaining practical encrypted inference latency. I then quantify the trade-offs between privacy budget, model performance, and resource usage. Finally, extensions to deeper models and applications, as well as larger hybrid cryptographic protocol setups, are discussed as directions for future research.
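The abstract describes ordered mini-batch training combined with differential-privacy noise injection. A minimal sketch of how such a trainer might look is below; it is not the thesis's actual implementation. The ordering criterion (sorting examples by feature norm), the hyperparameters, and the DP-SGD-style clip-then-add-Gaussian-noise step are all illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dp_logreg_ordered(X, y, epochs=20, batch_size=32, lr=0.1,
                      clip=1.0, noise_mult=0.5, seed=0):
    """Logistic regression via mini-batch gradient descent with
    per-example gradient clipping and Gaussian noise (DP-SGD style).

    Batches follow one fixed 'ordered' sequence every epoch (here:
    sorted by feature L2 norm, a hypothetical ordering rule) instead
    of being reshuffled each epoch as in standard training."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    order = np.argsort(np.linalg.norm(X, axis=1))  # assumed ordering rule
    Xo, yo = X[order], y[order]
    w = np.zeros(d)
    for _ in range(epochs):
        for start in range(0, n, batch_size):
            xb = Xo[start:start + batch_size]
            yb = yo[start:start + batch_size]
            # per-example gradients of the logistic loss
            errs = sigmoid(xb @ w) - yb            # shape (b,)
            grads = errs[:, None] * xb             # shape (b, d)
            # clip each example's gradient to L2 norm <= clip
            scale = np.maximum(np.linalg.norm(grads, axis=1) / clip, 1.0)
            grads = grads / scale[:, None]
            # sum, add Gaussian noise scaled to the clipping bound
            g = grads.sum(axis=0)
            g += rng.normal(0.0, noise_mult * clip, size=d)
            w -= lr * g / len(xb)
    return w
```

Raising `noise_mult` tightens the privacy guarantee at the cost of accuracy, which mirrors the privacy-budget/performance trade-off the abstract quantifies.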
Recommended Citation
Leone, Ryan, "Ordered Mini-batch Training for Differentially Private and Encrypted Logistic Regression" (2026). Theses, Dissertations and Culminating Projects. 1602.
https://digitalcommons.montclair.edu/etd/1602
Included in
Artificial Intelligence and Robotics Commons, Cybersecurity Commons, Information Security Commons