Tag Archives: Real Time Communications Conference & Expo

2024 RTC Conference Recap: Shining a Spotlight on AI in Healthcare and Voice AI Assistants

The 2024 Real Time Communication Conference at Illinois Tech was an electrifying event, showcasing emerging technologies across Voice, WebRTC, IoT/Edge, and groundbreaking research. But if you ask me, the real magic happens in the conversations between sessions. These impromptu chats with attendees always spark new ideas, collaborations, and insights that you won’t find on any slide deck. It’s a space where cutting-edge tech meets human curiosity and creativity, making for an unforgettable experience.

I had the pleasure of presenting two sessions this year, both deeply focused on AI’s transformative potential. From training machine learning models for medical analysis to mining digital conversations for actionable insights, here’s a recap of the key takeaways from both sessions—and resources to keep the learning going.

Session 1: Machine Learning for Good – Training Models for Medical Analysis

In this keynote, co-presented with Nikki-Rae Alkema, we explored how machine learning is reshaping healthcare, especially in diagnostics. We focused on multi-modal models, which fuse audio, video, and sensor inputs to catch conditions like Parkinson’s Disease early. By analyzing subtle cues across different data types, we’re not just looking at isolated symptoms but building a more comprehensive picture of patient health.
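To make the fusion idea concrete, here is a minimal sketch of what a multi-modal classifier can look like in PyTorch. The feature dimensions and layer sizes are purely illustrative assumptions (this is not the model from the talk), and it presumes audio and video features have already been extracted upstream.

```python
import torch
import torch.nn as nn

class MultiModalFusionClassifier(nn.Module):
    """Toy fusion model: encodes audio and video features separately,
    concatenates them, and predicts a single risk score."""

    def __init__(self, audio_dim: int = 40, video_dim: int = 128, hidden_dim: int = 64):
        super().__init__()
        self.audio_encoder = nn.Sequential(nn.Linear(audio_dim, hidden_dim), nn.ReLU())
        self.video_encoder = nn.Sequential(nn.Linear(video_dim, hidden_dim), nn.ReLU())
        self.head = nn.Sequential(
            nn.Linear(hidden_dim * 2, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, audio_features: torch.Tensor, video_features: torch.Tensor) -> torch.Tensor:
        fused = torch.cat(
            [self.audio_encoder(audio_features), self.video_encoder(video_features)],
            dim=-1,
        )
        return torch.sigmoid(self.head(fused))

# Example: a batch of 8 pre-extracted (hypothetical) feature vectors
model = MultiModalFusionClassifier()
scores = model(torch.randn(8, 40), torch.randn(8, 128))
print(scores.shape)  # torch.Size([8, 1])
```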

This session emphasized the human aspect of AI. It’s not about replacing healthcare professionals but augmenting their abilities. Every algorithm, every data point analyzed, translates to real human stories and health outcomes. The goal? To move healthcare from a reactive to a proactive stance, where early detection becomes the norm rather than the exception.

This work underscores the potential for machine learning to empower medical professionals with insights that weren’t possible before, bringing us closer to a future where AI truly enhances human care.

Session 2: Mining Conversations – Building NLP Models to Decode the Digital Chatter

In our increasingly digital world, conversation data is a treasure trove of insights. This session dove into the intricacies of Natural Language Processing (NLP), specifically how to build multiple NLP models to work in concert. Whether it’s Slack messages, Zoom calls, or social media chatter, there’s a wealth of unstructured data waiting to be harnessed.

We walked through collecting raw data from WebRTC applications, then cleaning, tokenizing, and preparing it for machine learning pipelines. This process enables us to extract meaningful insights, classify content, and recognize entities—turning raw digital chatter into a strategic asset.
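As a taste of that preprocessing stage, here is a small, self-contained sketch of cleaning and tokenizing a single raw chat message. The regexes and the sample message are illustrative stand-ins, not the exact pipeline from the session.

```python
import re

def clean_message(text: str) -> str:
    """Strip URLs, user mentions, emoji, and extra whitespace from a raw chat message."""
    text = re.sub(r"https?://\S+", " ", text)    # drop links
    text = re.sub(r"<@\w+>|@\w+", " ", text)     # drop Slack-style mentions
    text = re.sub(r"[^\w\s'?.!,]", " ", text)    # drop emoji and stray symbols
    return re.sub(r"\s+", " ", text).strip().lower()

def tokenize(text: str) -> list[str]:
    """Very simple whitespace/punctuation tokenizer."""
    return re.findall(r"\w+|[?.!,]", text)

raw = "Hey <@U123> can you share https://example.com/report by EOD? 🙏"
print(tokenize(clean_message(raw)))  # ['hey', 'can', 'you', 'share', 'by', 'eod', '?']
```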

Whether you’re analyzing customer service interactions or mining social media for trends, these NLP techniques open doors to more profound, data-driven insights, directly applicable to real-world use cases.

The Magic of In-Between Sessions: Final Thoughts

What makes the RTC Conference truly special is the community. Between presentations, I had fascinating discussions with industry leaders, researchers, and fellow AI enthusiasts. These conversations often linger on the edges of what we’re presenting, pushing ideas further and sparking fresh perspectives. From discussing the ethics of AI in diagnostics to exploring how NLP can evolve to understand more nuanced human emotions, these interactions made for a vibrant and thought-provoking experience.

If you missed the event, the session recordings are now available through the official conference site, along with slides, code, and more. Here’s to embracing AI’s potential together. Until next time!

Unlocking Conversations: Hands-On NLP for Real-World Data Mining

Hey there, tech enthusiasts! I’m thrilled to share that I’ll be hosting an exciting workshop at the upcoming Open Data Science Conference (ODSC). Titled “Building Multiple Natural Language Processing Models to Work in Concert Together”, this workshop will give you a practical, hands-on approach to creating and orchestrating NLP models. It’s not just another “hello world” session—this is about tackling real-world data and making it work for you.

Session Info:
Building Multiple Natural Language Processing Models to Work in Concert Together
Date: Oct 30, 2024
Time: 4:35pm

Why NLP and Why Now?

As conversations around the world explode in number, the need to make sense of them has become more critical than ever. Think about it: 1.5 billion messages on Slack every week, 300 million daily virtual meetings on Zoom at peak, and 260 million conversations happening on Facebook every day. The sheer scale of this data is astounding. But more than that, these conversations have transformed social platforms into treasure troves of information, offering insights into emerging trends, new associations, and evolving narratives.


At the workshop, we’ll delve into how to capture, analyze, and gain insights from this data using NLP. Whether you’re looking to spot trends, extract key information, or mine metadata, this session will provide you with the tools and techniques to turn this overwhelming amount of unstructured conversation data into something meaningful.

What You Can Expect

This workshop will be hands-on and highly interactive, featuring three primary components:

  1. Building a Question Classifier: We’ll start with a straightforward model that classifies sentences as questions or non-questions. You’ll see that even seemingly simple tasks can get complex as we deal with language’s natural ambiguity.

  2. Creating a Named Entity Recognition (NER) Model: Next, we’ll move into identifying specific entities within text, such as names, places, and organizations. I’ll show you how to gather, clean, and process data to build a reliable NER model that can extract meaningful information from conversations.

  3. Developing a Voice AI Assistant Demo: We’ll bring it all together by integrating both models into a voice assistant app that uses a RESTful API to process input and return classified and annotated data. This is where you’ll see how these models can work together in a real-world application, adding layers of context and relevance to raw data (see the sketch just after this list).
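To give a feel for how the pieces snap together, below is a minimal Flask sketch of the kind of RESTful endpoint the demo builds toward. The question classifier and NER model are stubbed out with naive placeholders here; in the workshop they are replaced by the trained models from parts 1 and 2.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

QUESTION_WORDS = {"who", "what", "when", "where", "why", "how"}

def classify_question(text: str) -> bool:
    """Stand-in for the trained question classifier from part 1 (naive heuristic)."""
    words = text.strip().lower().rstrip("?!.").split()
    return text.strip().endswith("?") or (bool(words) and words[0] in QUESTION_WORDS)

def extract_entities(text: str) -> list[dict]:
    """Stand-in for the trained NER model from part 2 (hard-coded lookups)."""
    known = {"Illinois Tech": "ORG", "Chicago": "LOC"}
    return [{"text": k, "label": v} for k, v in known.items() if k in text]

@app.route("/analyze", methods=["POST"])
def analyze():
    # Accept {"text": "..."} and return the combined output of both models
    text = request.get_json(force=True).get("text", "")
    return jsonify({
        "text": text,
        "is_question": classify_question(text),
        "entities": extract_entities(text),
    })

if __name__ == "__main__":
    app.run(port=5000)
```

Once it’s running, you can poke at it with something like: curl -X POST localhost:5000/analyze -H "Content-Type: application/json" -d '{"text": "Where is Illinois Tech?"}'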

Why Attend?

There are plenty of reasons to be excited about this workshop, but here are a few highlights:

  • Hands-on Learning: We’ll be coding live! For those who are less technical or don’t have the laptop prerequisites set up, I’ll be using Jupyter notebooks in Google Colab so everyone can follow along.

  • Real-World Applications: While many workshops focus on isolated NLP models, we’ll be tackling multiple models and showing how they can be combined for enhanced functionality. It’s a rare opportunity to see how these technologies can be applied in real-world scenarios.

  • Open Resources: I’ll provide code, data resources, and examples that you can take with you, adapt, and use on your projects. This workshop isn’t just about learning theory—it’s about equipping you with tools you can use.

See You at ODSC!

I’m incredibly excited to share this workshop with you all and to dive into the nitty-gritty of NLP. Whether you’re an experienced data scientist, an NLP enthusiast, or just curious about how these systems work, there will be something for you. Plus, you’ll walk away with new skills and practical examples that can help you build better models and unlock new insights from conversation data.

So, if you’re planning to attend ODSC, be sure to check out this session. You won’t want to miss it!

Workshop Info:
Building Multiple Natural Language Processing Models to Work in Concert Together
Date: Oct 30, 2024
Time: 4:35pm

Mining Conversations: Building NLP Models to Decode the Digital Chatter

The world has gone digital, and so have our conversations. With over 1.5 billion messages sent weekly on Slack, 300 million daily virtual meetings on Zoom at its peak, and millions of interactions across Facebook, TikTok, and other platforms, the volume of conversation data is staggering. These conversations hold valuable insights, from trend detection to user behavior analysis. How do we extract and mine that data?


At the 2024 RTC Conference at Illinois Tech, we’ll dive into the cutting-edge world of Natural Language Processing (NLP) and data mining to decode this digital chatter. I will be presenting a session titled Building Multiple Natural Language Processing Models to Work In Concert Together on October 8, 2024, at 4:15 PM. If you’re passionate about how AI and machine learning are transforming industries like healthcare, don’t miss it!

Here’s a brief rundown of what I’ll be covering during the session…

Breaking Down Conversations

Data is power, and conversation data is a goldmine waiting to be tapped. In this session, we’ll go step by step through the process of creating and training NLP models that can understand the context and meaning behind messages, whether from video meetings, audio calls, or text conversations.

It all starts with data. We’ll begin by learning how to collect raw conversation data from various sources, such as WebRTC applications like LiveKit. Once collected, the next challenge is preprocessing this data. We’ll explore strategies to clean and prepare text for machine learning pipelines, including noise reduction and tokenization.
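To make that concrete, here is a tiny illustrative sketch of the preprocessing step. The JSON record shape is a hypothetical stand-in (real WebRTC or LiveKit exports will look different); the point is turning raw, noisy utterances into clean tokens ready for a pipeline.

```python
import json

# Hypothetical shape of an exported transcript record; real WebRTC/LiveKit
# exports will differ, so treat this purely as an illustration of the step.
raw_records = [
    '{"speaker": "alice", "ts": 12.4, "text": "So, um, when is the release?"}',
    '{"speaker": "bob", "ts": 15.1, "text": "We are targeting next Friday!!!"}',
]

FILLER_WORDS = {"um", "uh", "like"}

def preprocess(record_json: str) -> dict:
    """Parse one utterance, drop filler words, and normalize the tokens."""
    record = json.loads(record_json)
    tokens = [t.strip(".,!?").lower() for t in record["text"].split()]
    tokens = [t for t in tokens if t and t not in FILLER_WORDS]
    return {"speaker": record["speaker"], "ts": record["ts"], "tokens": tokens}

dataset = [preprocess(r) for r in raw_records]
print(dataset[0]["tokens"])  # ['so', 'when', 'is', 'the', 'release']
```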


Once the data is ready, we’ll develop machine learning models that extract critical information, classifying sentences and performing named entity recognition. We’ll cover how to build these models using Python, PyTorch, and other state-of-the-art NLP tools.
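For a sense of the modeling side, here is a minimal PyTorch sketch of a sentence classifier trained on bag-of-words features. The session’s actual models are richer (and the NER model is separate), so treat the dimensions and random stand-in data as assumptions for illustration only.

```python
import torch
import torch.nn as nn

class SentenceClassifier(nn.Module):
    """Minimal feed-forward classifier over bag-of-words features."""

    def __init__(self, vocab_size: int, num_classes: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(vocab_size, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, bow_vectors: torch.Tensor) -> torch.Tensor:
        return self.net(bow_vectors)

vocab_size, batch_size = 5000, 16
model = SentenceClassifier(vocab_size)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on random stand-in data
features = torch.rand(batch_size, vocab_size)
labels = torch.randint(0, 2, (batch_size,))  # 1 = question, 0 = statement
optimizer.zero_grad()
loss = loss_fn(model(features), labels)
loss.backward()
optimizer.step()
print(float(loss))
```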

This session will highlight live demos, where I’ll showcase how to deploy and integrate these models into workflows for practical applications, such as customer service analysis, social media trend detection, and even compliance monitoring.

Why Attend This Session?

At the end of the session, you’ll walk away with more than just theory – you’ll get access to working code and resources that you can immediately apply to your projects. Whether you’re building conversation analytics tools for a social network or mining customer feedback from virtual meetings, this session is designed to provide actionable takeaways. By understanding how to train NLP models to analyze conversations, you can transform raw data into valuable insights for your organization.

If you’re interested in learning more about ML and NLP, I invite you to attend my session at the 2024 RTC Conference at Illinois Tech, Building Multiple Natural Language Processing Models to Work In Concert Together on Tuesday, October 8, 2024, at 4:15 PM.

2024 RTC Conference at Illinois Tech

You can use the discount code FFSPKR to get $200 off registration. Don’t miss this opportunity to explore the future of machine learning and NLP – register today and be part of the conversation!