S'MoRE: Structural Mixture of Residual Experts for LLM Fine-tuning • Paper • arXiv:2504.06426 • Published Apr 8, 2025
What is MoE 2.0? Update Your Knowledge about Mixture-of-Experts • Article • Published Apr 27, 2025
LLM-Rec: Personalized Recommendation via Prompting Large Language Models • Paper • arXiv:2307.15780 • Published Jul 24, 2023