Tuesday October 29, 2024 10:15am - 11:15am EDT
In this demonstration/tutorial, we focus on the use of large language models (LLMs) for real-time data modeling and prediction, particularly in the context of future wireless communications and networks. This research, carried out at the Center for Vehicle Communications and Networks (CVCC) at the University of Michigan-Dearborn, has been supported by local automobile industry companies in recent years.

The Transformer, pivotal in GPT models, excels at processing sequential data such as text. It converts text into numerical tokens, adds positional encodings to represent token order, and employs attention mechanisms to focus on the most relevant parts of the sequence. Multi-head attention lets the model weigh several kinds of relationships among tokens simultaneously, and its layered structure efficiently captures complex patterns. Predictions are made by generating a probability distribution over the vocabulary at each position.
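The core computation described above can be illustrated with a minimal sketch of scaled dot-product self-attention plus sinusoidal positional encodings. This is not code from the presented work; names, dimensions, and the toy data are illustrative assumptions only.

```python
# Minimal sketch of scaled dot-product self-attention with sinusoidal
# positional encodings (illustrative only; not from the presented work).
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: (seq_len, d_k) arrays; returns a (seq_len, d_k) array."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # pairwise token relevance
    weights = softmax(scores, axis=-1)   # attention distribution per token
    return weights @ V                   # weighted mix of value vectors

# Toy usage: 4 tokens with 8-dimensional embeddings (random placeholders)
# plus sinusoidal positional encodings to inject token order.
seq_len, d_model = 4, 8
tokens = np.random.randn(seq_len, d_model)
pos = np.arange(seq_len)[:, None]
i = np.arange(d_model)[None, :]
angle = pos / np.power(10000, (2 * (i // 2)) / d_model)
pos_enc = np.where(i % 2 == 0, np.sin(angle), np.cos(angle))
x = tokens + pos_enc
out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V = x
print(out.shape)  # (4, 8)
```

Multi-head attention repeats this computation with several independently projected Q, K, V sets and concatenates the results.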
Transformers, naturally but non-trivially, also excel at time-series modeling. Incorporating language models such as GPT-3 or BERT enriches real-time data exploration in wireless communications and networks, elevating capabilities in analysis, modeling, prediction, and pattern recognition. Research areas discussed in detail include generative wireless channel modeling using experimental data across the microwave and millimeter-wave bands, a C-V2X beamforming forecast model derived from field-test data at UM-Dearborn, and projections of traffic and active-user behavior within an innovative AI-driven protocol for Internet of Things (IoT) applications.
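One common way to cast such measurements as a sequence-modeling problem is sliding-window next-step prediction. The sketch below shows this framing on a synthetic signal-strength trace; the window length, data, and function names are hypothetical and not taken from the presented research.

```python
# Hypothetical sketch: framing a 1-D measurement series (e.g., channel or
# traffic samples) as next-step prediction pairs for a sequence model.
import numpy as np

def make_windows(series, context_len):
    """Slice a 1-D series into (context, next-value) training pairs."""
    X, y = [], []
    for t in range(len(series) - context_len):
        X.append(series[t:t + context_len])   # model input: past window
        y.append(series[t + context_len])     # target: the next sample
    return np.stack(X), np.array(y)

# Toy usage with a synthetic received-signal-strength trace (placeholder data).
rss = np.sin(np.linspace(0, 20, 500)) + 0.1 * np.random.randn(500)
X, y = make_windows(rss, context_len=32)
print(X.shape, y.shape)  # (468, 32) (468,)
```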
Vandenberg, The Michigan League
