
NVIDIA GTC CFP Speaking Session Analysis: Emphasizing Solutions Over “Things”

NVIDIA’s GPU Technology Conference (GTC), often dubbed the “Super Bowl of AI,” just wrapped up in Paris. I was fortunate enough to speak virtually on the topic of Explainable AI and attend the North American version in San Jose this year. I was impressed by the quality of the sessions and the breadth of subject matter covered. You can geek out on the cutting edge of AI techniques to discover some of the amazing things happening today, such as AI-assisted healthcare.


Part of my day-to-day job is thinking about new and interesting ideas and topics to present at AI/ML conferences like GTC. Because of this, I thought it would be worthwhile to examine what the NVIDIA GTC selection committee considers talks worthy of being presented in person on stage. GTC might be hosted by the world’s leading GPU maker, but you’d be mistaken to think that it’s just about the hardware they build. In reality, the overwhelming majority of sessions focus on software, AI solutions, and what you can do with NVIDIA’s technology… not the chips themselves. This isn’t just my opinion; it’s supported by the numbers and the way GTC is organized. That’s the focus of this blog post… let’s dive right in.

GTC Is About What You Can Do, Not Just What You Have

NVIDIA is no longer just a chipmaker cranking out GPUs. The company has been transforming into a full-stack AI platform provider. As one observer put it, NVIDIA went “from being a hardware company” to also wanting to be an AI software company. This shift is evident at GTC. The conference features hundreds of sessions across AI domains from computer vision and generative AI to robotics and healthcare… showcasing real-world applications and breakthroughs. In fact, GTC 2025 boasted over 1,000 sessions with 2,000+ speakers (in-person and virtual), covering everything from large language models to cloud computing and scientific discovery. The focus is clear: it’s on how NVIDIA’s platform is used to solve tough problems and transform industries, rather than on plain product pitches.


This approach makes sense. People attend GTC to learn and be inspired by what’s possible with AI. They want to hear how a researcher used GPUs to decode the human genome faster, or how a startup is deploying AI at the edge in hospitals… not a sales presentation about specs or cloud capabilities. NVIDIA’s own messaging around GTC highlights “breakthroughs happening now” and how AI is “powering the everyday brands that shape people’s lives” across various industries. The underlying hardware is crucial, but it’s the enabler, not the star of the show.

By the Numbers: Software and Solutions Dominate GTC Sessions

I took a deep dive into the latest GTC session catalog in Paris (the same pattern held in San Jose) and categorized the typical speaking sessions (excluding workshops, trainings, casual chats, etc.). There were 485 in-person talks at this European event last month, spanning multiple topic tracks. The breakdown by topic highlights that software, AI applications, and data science topics dominate the agenda, whereas discussions on pure hardware are relatively few. Here’s a snapshot (I’ll share a rough sketch of the tally logic below):

  • Generative AI: 81 talks (only 9 were sponsored slots, ~11% “paid”)

  • Simulation / Modeling / Design: 67 talks (2 sponsored, ~3%)

  • Data Science: 25 talks (0 sponsored, 0%)

  • Computer Vision: 21 talks (all content, 0% sponsored)

  • Edge Computing: 16 talks (0 sponsored, 0%)

  • MLOps (AI Deployment): 10 talks (all content, 0% sponsored)

  • Natural Language Processing: 6 talks (all content, 0% sponsored)

  • Tooling (Dev Tools, Frameworks): 33 talks (1 sponsored, ~3%)

  • Cloud Services and Infrastructure: 55 talks (27 sponsored, ~49%)

  • Hardware (GPUs & chips): 25 talks (16 sponsored, ~64%)

In plain terms, the “Generative AI” track alone had over 80 sessions, reflecting the huge interest in what people are building with large language models and AI creativity. “Simulation/Modeling/Design” was another big track with dozens of sessions. Traditional AI application areas, such as vision, NLP, and data science, collectively accounted for many talks as well. These are the kinds of sessions where speakers share research results, developer tips, and success stories using NVIDIA’s software stacks and GPUs.
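
For the curious, here’s a minimal sketch of how such a tally might work once the catalog is scraped into simple records. The “track” and “sponsored” field names are my own labels for illustration, not NVIDIA’s catalog schema:

```python
from collections import Counter

# Hypothetical records scraped from the GTC session catalog; the
# "track" and "sponsored" field names are my own, not NVIDIA's.
sessions = [
    {"track": "Generative AI", "sponsored": False},
    {"track": "Generative AI", "sponsored": True},
    {"track": "Hardware (GPUs & chips)", "sponsored": True},
    # ... one record per in-person talk (485 total in Paris)
]

totals = Counter(s["track"] for s in sessions)
sponsored = Counter(s["track"] for s in sessions if s["sponsored"])

for track, count in totals.most_common():
    pct = 100 * sponsored[track] / count
    print(f"{track}: {count} talks ({sponsored[track]} sponsored, ~{pct:.0f}%)")
```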


Notably, none of the talks in Data Science, Vision, NLP, Edge, or MLOps were paid talks… they were all merit-based content. The same was true for the vast majority of Generative AI and Simulation track talks (over 90% of those were non-sponsored content). This means the speakers earned their spot by having something interesting to say, not by buying a slot.

Contrast that with the Cloud and Hardware categories. The Cloud track (covering topics like cloud GPU services, data centers, and infrastructure) had about half of its talks come from sponsored sessions. The Hardware track (sessions about new chips, systems, and so on) was even more skewed: nearly two-thirds were “paid talks.” In other words, many of the sessions about cloud or hardware were more or less vendor presentations that likely came as part of a sponsorship package. This wasn’t a fluke or an oversight by NVIDIA; it appears to be by design.

Why Some Tracks Have So Many Sponsored Talks

If a conference track has a high percentage of sponsored talks, it’s a sign that the organizers expect those proposals could be salesy or promotional in nature. NVIDIA knows that a talk titled “Buy Our Cloud GPU Service!” isn’t what most attendees are eager to hear in any kind of session. The non-tech equivalent of this would be walking into a room only to find out you are trapped listening to a pitch to buy a timeshare in the Bahamas or something.


By comparison, a talk titled “How We Optimized Healthcare NLP Models on GPUs in the Cloud” is much more appealing… if it focuses on the solution and not just a particular product. (A big if, if you ask me.) GTC organizers seem to give the more promotional talks a seat at the table only if the presenters pay for the privilege (through sponsorship). This keeps the main content tracks filled with talks that are interesting to the audience on technical or innovative merits, and pushes anything overtly commercial into clearly marked slots.

Think about it: Hardware and Cloud are inherently areas where NVIDIA’s partners (or NVIDIA itself) might be tempted to pitch their latest and greatest. It’s not that hardware advances aren’t important… after all, NVIDIA is itself a chipmaker. GTC always has some exciting new GPU or system announcements, which are typically covered in keynotes or select sessions. But an hour-long breakout session that is essentially a product advertisement is not what GTC is curated for (by design).

Therefore, if Cloud providers or hardware vendors wish to present their offerings, they often appear as sponsors (the data indicates that 49-64% of those sessions were paid). This is a strong hint of what NVIDIA is looking for (and not looking for) in the session selection process.

What GTC Wants: Inspiring Use-Cases, Open Sharing, and Real Techniques

NVIDIA GTC sessions are meant to educate, inspire, and enable the audience of developers, researchers, and tech leaders. The best sessions tell a story of solving a real problem using AI and GPUs. They often involve sharing code, frameworks, or lessons learned that others can apply. In fact, open-source tools and projects are frequently front and center. From NVIDIA’s own perspective, many of their recent announcements have incorporated open-source elements; for instance, open-source foundation models for robotics were highlighted at GTC 2025. This open theme extends to the sessions, where speakers frequently discuss using open-source libraries (such as PyTorch and CUDA libraries), contributing to them, or building solutions based on open standards.

This alignment with open source is no coincidence. The open source community thrives on knowledge-sharing and collaboration, not on sales pitches. GTC, in spirit, tries to cultivate that same vibe. Attendees should walk away with new ideas, sample code, or a GitHub repo to check out, and a sense of possibility.


NVIDIA’s own conference description emphasizes how AI is transforming everything from healthcare to automotive to finance… again focusing on applications and impact. As a GTC attendee myself, I’ve noticed that the energy is much higher in sessions where the speaker is teaching something or demoing a breakthrough. To be fair, even the sponsored talks can contain useful info… but only if the speaker remembers to focus on how to solve problems using their tech, rather than just what their tech does.

Another trend… GTC content has become increasingly practical and concrete over the years. One conference veteran noted that the 2024 GTC was about “what if” future possibilities, while 2025 shifted to “what is,” focusing on how to advance current AI projects to the next level. That tells us NVIDIA is curating sessions that speak to today’s challenges and solutions, not just wild future speculation. If you’re proposing a talk, you’re more likely to be selected if you can demonstrate tangible results or techniques that attendees can use now (or very soon), as opposed to just theoretical ideas.

How to Get Your Talk Selected for AI/ML Conferences

From the data and observations above, we can distill a few clear guidelines for anyone hoping to speak at NVIDIA GTC (or really any top-tier AI/ML conference):

  • Focus on Solving a Problem: Frame your talk around a compelling problem or use case you tackled with AI. For example, instead of a generic “About Our GPU Product,” present “How We Reduced Training Time for Medical Imaging Models by 80%”. Show the audience how you did it, and share numbers, insights, or a code repo. GTC selectors appreciate real-world applications that have a tangible impact.

  • Keep It Technical and Educational: Contrary to popular belief, GTC is a developer-centric conference. Attendees appreciate code snippets, demos, benchmarks, and concrete tips. Don’t shy away from the technical aspect. Explain how you achieved your results (e.g., which SDKs, which algorithms, which optimization tricks). Make it a learning experience for the audience.

  • Avoid Marketing Hype: Steer clear of marketing fluff or self-promotion in your proposal and content, but above all, talk about something people actually want to hear. If you wouldn’t sit in on a session, odds are others won’t want to either. Phrases like “revolutionary platform” or slides of your product catalog are red flags; NVIDIA’s organizers can sniff out a sales pitch a mile away… and they’ll route those to sponsored slots (or reject them). Be honest and straightforward about what the audience will get from your talk.

  • Highlight Open-Source and Community Angle: If your solution utilizes open-source tools or you’re open-sourcing your work, mention that. Talks that share code or frameworks with the community inherently feel more like knowledge-sharing (which is exactly the spirit GTC wants).

  • Showcase NVIDIA Tech in Context: Since this is NVIDIA’s conference, it doesn’t hurt to mention that your solution leverages their technology (in fact, you probably should), but do it in a way that feels natural. It’s fine (and even expected) to use CUDA, RTX, or NVIDIA SDKs in your project; just don’t turn the discussion of those tools into a product pitch. Instead, make it about what they enabled.

  • Keep the Audience in Mind: Ask yourself, “What will someone watching my talk learn or be inspired to do?” GTC is meant to spark new ideas. If your talk proposal answers that question with something concrete, you’re aligning well with what GTC wants.

Wrapping It All Up!

To sum it up, NVIDIA is seeking talks that ignite curiosity and demonstrate to attendees how to achieve great things with AI, utilizing the hardware and software as tools. The hardware announcements will always have their place on the big stage (keynotes and a few deep-dive sessions), but the heart of GTC is all the amazing stuff people are doing with that hardware/platform. By structuring the conference this way, NVIDIA keeps GTC valuable and authentic for its audience. As an AI/ML developer and data scientist myself, I think that’s the only way to run a conference… and the only way to get me to sit through a session at one.

The Full Stop Thought… if you want to speak at GTC (or any AI/ML conference for that matter), bring real substance. Tell a story of innovation or problem-solving, backed by data and live demos. Align with the interests of the community (open science, open source, and cutting-edge applications). NVIDIA’s selection trends show they favor the inspiring engineer or researcher over the slick salesperson; I’ve definitely noticed this myself as I’ve moved deeper into the data science world. The GPUs and cloud instances are just means to an end… what matters is the awesome things you accomplished with them. Keep that focus, and you’ll not only increase your chances of getting a CFP accepted but also deliver a talk that resonates with one of the largest AI audiences in the world. And that’s a win-win for everyone.

2024 RTC Conference Recap: Shining a Spotlight on AI in Healthcare and Voice AI Assistants

The 2024 Real Time Communication Conference at Illinois Tech was an electrifying event, showcasing emerging technologies across Voice, WebRTC, IoT/Edge, and groundbreaking research. But if you ask me, the real magic happens in the conversations between sessions. These impromptu chats with attendees always spark new ideas, collaborations, and insights that you won’t find on any slide deck. It’s a space where cutting-edge tech meets human curiosity and creativity, making for an unforgettable experience.

I had the pleasure of presenting two sessions this year, both deeply focused on AI’s transformative potential. From training machine learning models for medical analysis to mining digital conversations for actionable insights, here’s a recap of the key takeaways from both sessions—and resources to keep the learning going.


Session 1: Machine Learning for Good – Training Models for Medical Analysis

In this keynote, co-presented with Nikki-Rae Alkema, we explored how machine learning is reshaping healthcare, especially in diagnostics. We focused on multi-modal models: the fusion of audio, video, and sensor inputs to catch conditions like Parkinson’s Disease early. By analyzing subtle cues across different data types, we’re not just looking at isolated symptoms but building a more comprehensive picture of patient health.

This session emphasized the human aspect of AI. It’s not about replacing healthcare professionals but augmenting their abilities. Every algorithm, every data point analyzed, translates to real human stories and health outcomes. The goal? To move healthcare from a reactive to a proactive stance, where early detection becomes the norm rather than the exception.

This work underscores the potential for machine learning to empower medical professionals with insights that weren’t possible before, bringing us closer to a future where AI truly enhances human care.


Session 2: Mining Conversations – Building NLP Models to Decode the Digital Chatter

In our increasingly digital world, conversation data is a treasure trove of insights. This session dove into the intricacies of Natural Language Processing (NLP), specifically how to build multiple NLP models to work in concert. Whether it’s Slack messages, Zoom calls, or social media chatter, there’s a wealth of unstructured data waiting to be harnessed.

We walked through collecting raw data from WebRTC applications, then cleaning, tokenizing, and preparing it for machine learning pipelines. This process enables us to extract meaningful insights, classify content, and recognize entities—turning raw digital chatter into a strategic asset.
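
To make that concrete, here’s a minimal sketch of the cleaning and tokenizing step, assuming the transcript text has already been captured from the WebRTC side; the helper names are my own, not from any particular library:

```python
import re

def clean_utterance(text: str) -> str:
    """Normalize one raw transcript utterance before tokenization."""
    text = text.lower()
    text = re.sub(r"https?://\S+", " ", text)   # drop URLs
    text = re.sub(r"[^a-z0-9'\s]", " ", text)   # strip punctuation and emoji
    return re.sub(r"\s+", " ", text).strip()    # collapse whitespace

def tokenize(text: str) -> list[str]:
    """Whitespace tokenization over the cleaned text."""
    return clean_utterance(text).split()

# e.g., a captioned utterance pulled from a WebRTC call
raw = "Hey team, demo at 3pm? Notes: https://example.com/notes"
print(tokenize(raw))
# ['hey', 'team', 'demo', 'at', '3pm', 'notes']
```

In a real pipeline you’d likely swap the whitespace tokenizer for whatever your downstream model expects, but the shape of the step (normalize, then split) stays the same.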

Whether you’re analyzing customer service interactions or mining social media for trends, these NLP techniques open doors to more profound, data-driven insights, directly applicable to real-world use cases.


The Magic of In-Between Sessions: Final Thoughts

What makes the RTC Conference truly special is the community. Between presentations, I had fascinating discussions with industry leaders, researchers, and fellow AI enthusiasts. These conversations often linger on the edges of what we’re presenting, pushing ideas further and sparking fresh perspectives. From discussing the ethics of AI in diagnostics to exploring how NLP can evolve to understand more nuanced human emotions, these interactions made for a vibrant and thought-provoking experience.

If you missed the event, the session recordings are available through the official conference site now! Take a look at the slides, code and more! Here’s to embracing AI’s potential together—until next time!

Unlocking Conversations: Hands-On NLP for Real-World Data Mining

Hey there, tech enthusiasts! I’m thrilled to share that I’ll be hosting an exciting workshop at the upcoming Open Data Science Conference (ODSC). Titled “Building Multiple Natural Language Processing Models to Work in Concert Together”, this workshop will give you a practical, hands-on approach to creating and orchestrating NLP models. It’s not just another “hello world” session—this is about tackling real-world data and making it work for you.

Session Info:
Building Multiple Natural Language Processing Models to Work in Concert Together
Date: Oct 30, 2024
Time: 4:35pm

Why NLP and Why Now?

As conversations around the world explode in number, the need to make sense of them has become more critical than ever. Think about it: 1.5 billion messages on Slack every week, 300 million daily virtual meetings on Zoom at peak, and 260 million conversations happening on Facebook every day. The sheer scale of this data is astounding. But more than that, these conversations have transformed social platforms into treasure troves of information, offering insights into emerging trends, new associations, and evolving narratives.

At the workshop, we’ll delve into how to capture, analyze, and gain insights from this data using NLP. Whether you’re looking to spot trends, extract key information, or mine metadata, this session will provide you with the tools and techniques to turn this overwhelming amount of unstructured conversation data into something meaningful.

What You Can Expect

This workshop will be hands-on and highly interactive, featuring three primary components:

  1. Building a Question Classifier: We’ll start with a straightforward model that classifies sentences as questions or non-questions. You’ll see that even seemingly simple tasks can get complex as we deal with language’s natural ambiguity.
  2. Creating a Named Entity Recognition (NER) Model: Next, we’ll move into identifying specific entities within text, such as names, places, and organizations. I’ll show you how to gather, clean, and process data to build a reliable NER model that can extract meaningful information from conversations.
  3. Developing a Voice AI Assistant Demo: We’ll bring it all together by integrating both models into a voice assistant app that uses a RESTful API to process input and return classified and annotated data. This is where you’ll see how these models can work together in a real-world application, adding layers of context and relevance to raw data (there’s a rough sketch of this idea right after this list).
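
To give a flavor of the “in concert” idea, here’s a minimal sketch, assuming scikit-learn for the question classifier and spaCy for the NER side; the training examples are purely illustrative, and the workshop’s actual models and datasets are more involved:

```python
import spacy
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set: 1 = question, 0 = non-question
texts = ["what time is the demo", "can you share the slides",
         "the meeting starts at noon", "I pushed the fix yesterday",
         "where is the conference held", "thanks for the recap"]
labels = [1, 1, 0, 0, 1, 0]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)

# Requires: python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

def annotate(utterance: str) -> dict:
    """Run both models on one utterance and merge their outputs."""
    doc = nlp(utterance)
    return {
        "text": utterance,
        "is_question": bool(clf.predict([utterance])[0]),
        "entities": [(ent.text, ent.label_) for ent in doc.ents],
    }

print(annotate("when does the workshop start at ODSC"))
```

In the demo itself, something like this annotate() function would sit behind the RESTful endpoint that the voice assistant calls.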

Why Attend?

There are plenty of reasons to be excited about this workshop, but here are a few highlights:

  • Hands-on Learning: We’ll be coding live! For those who are less technical or don’t have the laptop prerequisites set up, I’ll be using Jupyter notebooks in Google Colab, so everyone can follow along.
  • Real-World Applications: While many workshops focus on isolated NLP models, we’ll be tackling multiple models and showing how they can be combined for enhanced functionality. It’s a rare opportunity to see how these technologies can be applied in real-world scenarios.
  • Open Resources: I’ll provide code, data resources, and examples that you can take with you, adapt, and use in your own projects. This workshop isn’t just about learning theory—it’s about equipping you with tools you can use.

See You at ODSC!

I’m incredibly excited to share this workshop with you all and to dive into the nitty-gritty of NLP. Whether you’re an experienced data scientist, an NLP enthusiast, or just curious about how these systems work, there will be something for you. Plus, you’ll walk away with new skills and practical examples that can help you build better models and unlock new insights from conversation data.

So, if you’re planning to attend ODSC, be sure to check out this session. You won’t want to miss it!

Workshop Info:
Building Multiple Natural Language Processing Models to Work in Concert Together
Date: Oct 30, 2024
Time: 4:35pm