Title: Communication-Efficient Learning of Deep Networks from Decentralized Data
Authors:
H. Brendan McMahan
Eider Moore
Daniel Ramage
Seth Hampson
Blaise Agüera y Arcas
Affiliation: Google, Inc., 651 N 34th St., Seattle, WA 98103 USA
Venue:
Basic Info
Title: Fine-Tuning Large Language Models with User-Level Differential Privacy
Venue: ICML 2024
Authors:
Zachary Charles, Google Research, Seattle, WA, USA, zachcharles@google.com
Arun Ganesh, Google Research, Seattle, WA, USA, arunganesh@google.com
Ryan McKenna, Google Research, Seattle, WA, USA, mckennar@google.com
Basic Info
Title: Federated Learning of Large Language Models with Parameter-Efficient Prompt Tuning and Adaptive Optimization
Authors:
Tianshi Che (Auburn University, Auburn, United States)
Ji Liu (Hithink RoyalFlush Information Network Co., Ltd., Hangzhou, Zhejiang, China)
Yang Zhou (Auburn University, Auburn, United States)
Basic Info
Title: PrE-Text: Training Language Models on Private Federated Data in the Age of LLMs
Authors: Charlie Hou, Akshat Shrivastava, Hongyuan Zhan, Rylan Conway, Trang Le, Adithya Sagar, Giulia Fanti, Daniel Lazar
Affiliations:
Charlie Hou, Giulia Fanti: Carnegie Mellon University, Department of Electrical and Computer Engineering
Basic Info
Title: FedBiOT: LLM Local Fine-tuning in Federated Learning without Full Model
Authors:
Feijie Wu (Purdue University, wu1977@purdue.edu)
Zitao Li (Alibaba Group, zitao.l@alibaba-inc.com)
Yaliang Li (Alibaba Group, yaliang.li@alibaba-inc.com)
Bolin Ding (Alibaba Group, bolin.ding@alibaba-inc.com)
Jing
Basic Info
Title: EDGE-LLM: Enabling Efficient Large Language Model Adaptation on Edge Devices via Layerwise Unified Compression and Adaptive Layer Tuning & Voting
Authors:
Zhongzhi Yu, Zheng Wang, Yuhan Li, Haoran You, Ruijie Gao, Xiaoya Zhou, Sreenidhi Reedy Bommu, Yang (Katie) Zhao, Yingyan (Celine) Lin
Basic Info
Title: SPT: Fine-Tuning Transformer-based Language Models Efficiently with Sparsification
Authors: Yuntao Gui, Xiao Yan, Peiqi Yin, Han Yang, James Cheng
Affiliations:
The Chinese University of Hong Kong, Hong Kong SAR, China
Southern University of Science and Technology, China
Keywords: language models, transformer