The 2024 Real Time Communication Conference at Illinois Tech was an electrifying event, showcasing emerging technologies across Voice, WebRTC, IoT/Edge, and groundbreaking research. But if you ask me, the real magic happens in the conversations between sessions. These impromptu chats with attendees always spark new ideas, collaborations, and insights that you won’t find on any slide deck. It’s a space where cutting-edge tech meets human curiosity and creativity, making for an unforgettable experience.
I had the pleasure of presenting two sessions this year, both deeply focused on AI’s transformative potential. From training machine learning models for medical analysis to mining digital conversations for actionable insights, here’s a recap of the key takeaways from both sessions—and resources to keep the learning going.
---
Session 1: Machine Learning for Good – Training Models for Medical Analysis
In this keynote, co-presented with Nikki-Rae Alkema, we explored how machine learning is reshaping healthcare, especially in diagnostics. We focused on multi-modal models: fusing audio, video, and sensor inputs to catch conditions like Parkinson's disease early. By analyzing subtle cues across different data types, we're not just looking at isolated symptoms but building a more comprehensive picture of patient health.
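To make the fusion idea concrete, here is a minimal late-fusion sketch in Python: features from each modality are concatenated into a single vector per patient and fed to one classifier. The feature extractors, array shapes, and scikit-learn pipeline are illustrative assumptions, not the models presented in the session.

```python
# Minimal late-fusion sketch (illustrative only). The per-modality features
# are random stand-ins for real measurements such as voice jitter/shimmer,
# gait accelerometer statistics, or facial landmark dynamics.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n_patients = 200

audio_features = rng.normal(size=(n_patients, 16))   # speech/voice measurements
video_features = rng.normal(size=(n_patients, 24))   # movement/facial measurements
sensor_features = rng.normal(size=(n_patients, 8))   # wearable sensor measurements
labels = rng.integers(0, 2, size=n_patients)         # 1 = early indicators present

# Late fusion: concatenate modality features into one vector per patient.
fused = np.hstack([audio_features, video_features, sensor_features])

model = make_pipeline(StandardScaler(), GradientBoostingClassifier())
scores = cross_val_score(model, fused, labels, cv=5, scoring="roc_auc")
print(f"Cross-validated AUC: {scores.mean():.2f} (+/- {scores.std():.2f})")
```

In a real setting, each modality would first pass through its own feature extractor (speech analysis, pose estimation, wearable signal processing) before fusion, and clinical labels would replace the synthetic ones used here.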
This session emphasized the human aspect of AI. It’s not about replacing healthcare professionals but augmenting their abilities. Every algorithm, every data point analyzed, translates to real human stories and health outcomes. The goal? To move healthcare from a reactive to a proactive stance, where early detection becomes the norm rather than the exception.
- Key Resources:
  - Session Recording: Watch the Recording
  - Slides: Download here
  - Coding Resources: Access the GitHub repository
This work underscores the potential for machine learning to empower medical professionals with insights that weren’t possible before, bringing us closer to a future where AI truly enhances human care.
---
Session 2: Mining Conversations – Building NLP Models to Decode the Digital Chatter
In our increasingly digital world, conversation data is a treasure trove of insights. This session dove into the intricacies of Natural Language Processing (NLP), specifically how to build multiple NLP models to work in concert. Whether it’s Slack messages, Zoom calls, or social media chatter, there’s a wealth of unstructured data waiting to be harnessed.
We walked through collecting raw data from WebRTC applications, then cleaning, tokenizing, and preparing it for machine learning pipelines. This process enables us to extract meaningful insights, classify content, and recognize entities—turning raw digital chatter into a strategic asset.
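As a rough illustration of that pipeline, the sketch below cleans raw messages, tokenizes them via TF-IDF, and trains a simple classifier. The example messages, labels, and cleaning rules are hypothetical; the multi-model setup from the session (layering entity recognition and other models on top) goes well beyond this.

```python
# Illustrative text-classification sketch: clean, tokenize, classify.
import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def clean(text: str) -> str:
    """Normalize a raw chat message: strip URLs, @mentions, extra whitespace."""
    text = re.sub(r"https?://\S+", " ", text)   # drop links
    text = re.sub(r"@\w+", " ", text)           # drop mentions
    return re.sub(r"\s+", " ", text).strip().lower()

# Hypothetical labeled snippets from chat transcripts.
messages = [
    "The video call keeps dropping, can someone check the TURN server?",
    "Huge thanks to the team for shipping the new dashboard!",
    "Audio is choppy again on the weekly standup call.",
    "Loving the new noise suppression feature, works great.",
]
labels = ["issue", "praise", "issue", "praise"]

# TF-IDF handles tokenization; the classifier learns to tag new messages.
pipeline = make_pipeline(
    TfidfVectorizer(preprocessor=clean, ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
pipeline.fit(messages, labels)

# Classify a new, unseen message.
print(pipeline.predict(["Screen share froze during the demo call"]))
```

Swapping in transformer embeddings or a named-entity model follows the same shape: each model consumes the cleaned text and contributes its own layer of structure to the raw conversation data.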
- Key Resources:
  - Session Recording: Watch the Recording
  - Slides: Download here
  - Coding Resources: Access the GitHub repository
Whether you're analyzing customer service interactions or mining social media for trends, these NLP techniques turn unstructured conversation data into practical, data-driven insights for real-world use cases.
---
The Magic of In-Between Sessions: Final Thoughts
What makes the RTC Conference truly special is the community. Between presentations, I had fascinating discussions with industry leaders, researchers, and fellow AI enthusiasts. These conversations often linger on the edges of what we’re presenting, pushing ideas further and sparking fresh perspectives. From discussing the ethics of AI in diagnostics to exploring how NLP can evolve to understand more nuanced human emotions, these interactions made for a vibrant and thought-provoking experience.
If you missed the event, the session recordings are available through the official conference site now! Take a look at the slides, code and more! Here’s to embracing AI’s potential together—until next time!