In this paper, we explore the potential of artificial intelligence (AI) to address challenges in terahertz ultra-massive multiple-input multiple-output (THz UM-MIMO) systems. We identify three key challenges for transceiver design: "hard to compute," "hard to model," and "hard to measure," and argue that AI can provide promising solutions. We propose three research roadmaps for AI algorithms tailored to THz UM-MIMO systems. The first, model-driven deep learning (DL), emphasizes leveraging domain knowledge and using AI to enhance bottleneck modules in established signal processing or optimization frameworks. We discuss four steps: algorithmic frameworks, basis algorithms, loss function design, and neural architecture design. The second roadmap presents channel state information (CSI) foundation models to unify the design of transceiver modules by focusing on the wireless channel. We propose a compact foundation model to estimate wireless channel score functions, which serve as a prior for designing transceiver modules. We outline four steps: general frameworks, conditioning, site-specific adaptation, and joint design of CSI models and model-driven DL. The third roadmap explores applying pre-trained large language models (LLMs) to THz UM-MIMO systems, with applications in estimation, optimization, searching, network management, and protocol understanding. Finally, we discuss open problems and future research directions.
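To make the score-function idea concrete, the following is a minimal, illustrative sketch rather than the paper's implementation: it assumes a linear pilot model y = A h + n and uses a hypothetical placeholder, `score_placeholder`, in place of the learned CSI foundation model, plugging the channel score into a simple gradient-based MAP-style estimator.

```python
import numpy as np

# Minimal sketch (assumptions, not the paper's method): a learned channel score
# s_theta(h) ~ grad_h log p(h) used as a prior when estimating h from pilots
# y = A h + n. `score_placeholder` is a hypothetical stand-in for the compact
# CSI foundation model; here it is simply the score of a unit Gaussian prior.

def score_placeholder(h):
    # Hypothetical stand-in for a trained score network: score of N(0, I) is -h.
    return -h

def score_guided_estimate(y, A, noise_var, steps=300, lr=1e-3):
    # Gradient ascent on log p(y | h) + log p(h), with the prior gradient
    # supplied by the (learned) score function.
    h = A.conj().T @ y                                   # matched-filter initialization
    for _ in range(steps):
        grad_lik = A.conj().T @ (y - A @ h) / noise_var  # gradient of the Gaussian log-likelihood
        grad_prior = score_placeholder(h)                # gradient of the log-prior (score)
        h = h + lr * (grad_lik + grad_prior)
    return h

# Toy usage with a random pilot matrix and a synthetic channel.
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 32)) / np.sqrt(64)
h_true = rng.standard_normal(32)
y = A @ h_true + 0.05 * rng.standard_normal(64)
h_hat = score_guided_estimate(y, A, noise_var=0.05**2)
print(np.linalg.norm(h_hat - h_true) / np.linalg.norm(h_true))
```

In an actual system, the placeholder score would be replaced by the trained CSI foundation model, and the same prior could be reused across transceiver modules such as channel estimation, detection, and precoding.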