A new AI-driven technology developed by researchers at UNIST promises to significantly reduce data transmission loads during image transfer, paving the way for advancements in autonomous vehicles, remote surgery and diagnostics, and real-time metaverse rendering—applications that demand rapid, large-scale visual data exchange.
Led by Professor Sung Whan Yoon from the Graduate School of Artificial Intelligence at UNIST, the research team developed Task-Adaptive Semantic Communication, an innovative wireless image transmission method that selectively transmits only the most essential semantic information relevant to the specific task. Their study is published in the IEEE Journal on Selected Areas in Communications.
Current wireless image transmission methods compress entire images without considering their underlying semantic structures—such as objects, layout, and relationships—straining bandwidth and introducing transmission delays that hinder real-time sharing of high-resolution images.
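The underlying idea can be sketched with a toy example. This is not the authors' scheme—the object labels, payload sizes, and encoding below are entirely hypothetical—but it illustrates why sending only task-relevant semantics can be far smaller than sending a compressed full frame:

```python
# Toy illustration (hypothetical values, not the UNIST method):
# compare a raw full-frame payload with a payload that carries only
# the semantic objects relevant to a given downstream task.

FULL_IMAGE_BYTES = 1920 * 1080 * 3  # raw RGB frame, ~6.2 MB

# A hypothetical scene description that a semantic encoder might extract
scene = [
    {"label": "pedestrian",    "box": (410, 220, 60, 140)},
    {"label": "traffic_light", "box": (900, 80, 30, 70)},
    {"label": "tree",          "box": (100, 50, 200, 400)},
]

def semantic_payload(scene, task_labels):
    """Keep only the objects relevant to the task; return them plus
    a rough payload size (1 byte label id + four 2-byte coordinates)."""
    kept = [obj for obj in scene if obj["label"] in task_labels]
    return kept, len(kept) * (1 + 4 * 2)

# For a driving task, only pedestrians and traffic lights matter here
kept, nbytes = semantic_payload(scene, {"pedestrian", "traffic_light"})
print(f"full image:       {FULL_IMAGE_BYTES} bytes")
print(f"semantic payload: {nbytes} bytes for {len(kept)} task-relevant objects")
```

The point of the sketch is the task-dependence: a different task (say, vegetation mapping) would keep a different subset of the same scene, so the transmitted payload adapts to what the receiver actually needs.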