Nutanix’s Lee Caswell on GPT-in-a-Box, LLMs, and Managing Data

In a fast-evolving AI landscape, enterprises are racing to harness the power of large language models (LLMs) while managing the complex realities of data privacy, infrastructure, and operational flexibility. In this TechVoices interview, Lee Caswell, Senior Vice President of Product and Solutions Marketing at Nutanix, discusses “GPT in a Box”—an enterprise-ready solution designed to simplify AI deployment across on-prem, cloud, and edge environments.

Caswell explores how agentic AI is reshaping enterprise use cases and why data is the real differentiator in generative AI, among other AI-focused issues.

Key Points: GPT-in-a-Box, Training LLMs, Agentic AI 

  • Unlike simply connecting to ChatGPT via API, GPT-in-a-Box offers infrastructure-ready support for multiple LLMs (such as LLaMA 2 and Hugging Face models), enabling enterprise-grade customization and control.
  • The solution treats LLMs like applications, allowing companies to swap in new models over time without starting from scratch.
  • Enterprises want to train LLMs on proprietary data while maintaining security and regulatory compliance (e.g., HIPAA, GDPR).
  • Caswell notes that a model trained on proprietary data can be more sensitive than the data itself—making model control crucial.
  • The modern enterprise infrastructure must support a hybrid, distributed data environment with governance across all locations.

Key Quotes: “the models are becoming increasingly refined”

Caswell discussed the importance of models in AI, the details of GPT-in-a-Box, and how infrastructure supports the rise of agentic AI.

The value of data over models:

“The model based on your data—trained on your data—is actually more sensitive than the data itself. Now you have a trained model on all of the data you have from all of the tests and procedures you’ve done. That becomes a core IP of how you’re synthesizing the data you have in this new model, and the models are becoming increasingly refined.”

The infrastructure behind GPT-in-a-Box:

“GPT-in-a-Box is a way to basically bring all the pieces together for customers who were worried about the infrastructure implications of AI. We’re giving you a single endpoint integration into all of the LLMs that you can now bring into your on-prem environment and run just like any other enterprise application.”
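In practice, a "single endpoint integration" usually means the application talks to one stable API while the model behind it is swapped by name. The sketch below illustrates that idea in Python; the function and model names are illustrative assumptions, not Nutanix's actual API.

```python
# Hypothetical sketch: one endpoint, interchangeable models.
# The application code stays the same; only the model identifier changes.

def build_chat_request(model: str, prompt: str) -> dict:
    """Build a chat-completions style payload for a single shared endpoint."""
    return {
        "model": model,  # swap "llama-2-7b-chat" for a newer model later
        "messages": [{"role": "user", "content": prompt}],
    }

# Treating LLMs like applications: iterate over models without rewriting the app.
payloads = [
    build_chat_request(m, "Summarize our Q3 results.")
    for m in ("llama-2-7b-chat", "some-newer-model")
]
```

Because only the `model` field differs between requests, enterprises can retire one model and adopt another without restructuring the calling application.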

The rise of agentic AI and infrastructure demands:

“The agentic cycle is this: you get that first model and the first results, and then you’ll put it into another model—that’s your agent—saying, ‘Now I’m going to go and look at something like guardrails.’ You could do re-ranking, refine the results, and apply policies. That’s why IT infrastructure is growing in importance—because you need the processing power, storage, and flexibility to support this refinement across environments.”
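The cycle Caswell describes can be sketched as a small pipeline: a first model produces candidates, then an agent stage applies guardrails, re-ranks, and enforces policy before returning a result. Every function here is an illustrative stub standing in for real model calls, not any specific product's API.

```python
# Hypothetical sketch of the agentic cycle: generate, guard, re-rank, return.

def first_model(prompt: str) -> list[str]:
    # Stand-in for an LLM call that returns candidate answers.
    return [f"draft answer A to: {prompt}", f"draft answer B to: {prompt}"]

def guardrail(candidates: list[str]) -> list[str]:
    # Drop candidates that violate a (toy) policy, e.g. banned terms.
    banned = {"confidential"}
    return [c for c in candidates
            if not any(b in c.lower() for b in banned)]

def rerank(candidates: list[str]) -> list[str]:
    # Stand-in scoring: here we simply prefer shorter answers.
    return sorted(candidates, key=len)

def agentic_cycle(prompt: str) -> str:
    # Each stage is a separate pass over the model's output -- this
    # repeated processing is what drives the infrastructure demand.
    candidates = first_model(prompt)
    candidates = guardrail(candidates)
    candidates = rerank(candidates)
    return candidates[0] if candidates else ""
```

Each extra pass (guardrails, re-ranking, policy checks) multiplies the compute and storage the pipeline consumes, which is the infrastructure point Caswell is making.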


James Maguire

An award-winning journalist, James has held top editorial roles in several leading technology publications, covering enterprise tech trends in cloud computing, AI, data analytics, cybersecurity and more. He regularly communicates with industry analysts and experts and has interviewed hundreds of technology executives. James is the Executive Director of TechVoices.
