[Paper] Modular Foundation Model Inference at the Edge: Network-Aware Microservice Optimization
Source: arXiv - 2601.19563v1
Overview
Foundation models (FMs) unlock unprecedented multimodal and multitask intelligence, yet their cloud‑centric deployment precludes real‑time responsiveness and compromises user privacy. Meanwhile, monolithic execution at the edge remains infeasible under stringent resource limits and uncertain network dynamics. To bridge this gap, we propose a microservice‑based FM inference framework that exploits the intrinsic functional asymmetry between heavyweight core services and agile light services. Our two‑tier deployment strategy ensures robust Quality of Service (QoS) under resource contention.
Specifically, core services are placed statically via a long‑term network‑aware integer program with sparsity constraints to form a fault‑tolerant backbone. Light services are orchestrated dynamically by a low‑complexity online controller that integrates effective capacity theory with Lyapunov optimization, providing probabilistic latency guarantees under real‑time workload fluctuations. Simulations demonstrate that our framework achieves over 84% average on‑time task completion with moderate deployment costs and maintains strong robustness as the system load scales.
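The online controller described above follows the standard Lyapunov drift‑plus‑penalty pattern: each slot, pick the action that trades off immediate cost against the backlog of virtual queues. The sketch below is illustrative only — the function names, the action representation, and the control weight `V` are assumptions, not the paper's actual formulation.

```python
def drift_plus_penalty_choice(queues, actions, V):
    """Pick the action minimizing V*cost + sum_i Q_i*(arrival_i - service_i).

    Each action is a tuple (cost, arrivals, services); `V` weights the
    cost penalty against queue stability (larger V favors lower cost).
    """
    def score(action):
        cost, arrivals, services = action
        drift = sum(q * (a - s) for q, a, s in zip(queues, arrivals, services))
        return V * cost + drift
    return min(actions, key=score)

def update_queues(queues, arrivals, services):
    """Standard virtual-queue update: Q <- max(Q + arrivals - services, 0)."""
    return [max(q + a - s, 0.0) for q, a, s in zip(queues, arrivals, services)]
```

With a large backlog, the controller prefers the action that serves the queue even at higher cost; with empty queues, it simply minimizes cost.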
Key Contributions
- A microservice‑based FM inference framework that exploits the functional asymmetry between heavyweight core services and agile light services.
- A static, long‑term network‑aware integer program with sparsity constraints that places core services to form a fault‑tolerant backbone.
- A low‑complexity online controller combining effective capacity theory with Lyapunov optimization, giving probabilistic latency guarantees for light services under workload fluctuations.
- Simulation results showing over 84% average on‑time task completion at moderate deployment cost, with robustness as system load scales.
Methodology
Please refer to the full paper for detailed methodology.
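At a high level, the core‑service placement is a cost‑minimizing integer program with a sparsity (cardinality) constraint on the chosen edge nodes. A minimal brute‑force sketch of that problem shape is below, assuming a toy instance; the node/service names, cost model, and coverage map are hypothetical and stand in for the paper's actual network‑aware formulation.

```python
from itertools import combinations

def place_core_services(nodes, services, cost, covers, k):
    """Exhaustively solve a tiny placement problem: choose at most k nodes
    (the sparsity constraint) whose hosted services cover every core
    service, minimizing total deployment cost.

    cost:   dict node -> deployment cost
    covers: dict node -> set of services that node can host
    """
    best_cost, best_sel = float("inf"), None
    for r in range(1, k + 1):
        for sel in combinations(nodes, r):
            hosted = set().union(*(covers[n] for n in sel))
            if hosted >= set(services):            # feasibility: full coverage
                c = sum(cost[n] for n in sel)
                if c < best_cost:
                    best_cost, best_sel = c, sel
    return best_sel, best_cost
```

Real instances would use an ILP solver rather than enumeration; the sketch only illustrates the objective and constraints.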
Practical Implications
By splitting foundation‑model inference into statically placed core services and dynamically orchestrated light services, this work makes real‑time, privacy‑preserving FM inference feasible on resource‑constrained edge infrastructure, with probabilistic latency guarantees rather than best‑effort QoS.
Authors
- Juan Zhu
- Zixin Wang
- Shenghui Song
- Jun Zhang
- Khaled Ben Letaief
Paper Information
- arXiv ID: 2601.19563v1
- Categories: cs.DC
- Published: January 27, 2026