The AI Goldmine Hiding in Plain Sight
It’s not LLMs or chips so much as good old-fashioned data center infrastructure. Plus: A fresh take on leveraging overseas tech labor from IT By Design.
A reminder: Last week’s post quoted a podcast interview I did recently with Kaseya CEO Fred Voccola about early adoption of the company’s Kaseya 365 solution and the “monster, monster, monster” follow-up announcement due to go public during this year’s DattoCon conference in October. That episode of the show (called MSP Chat, by the way) is now live here.
I’m biased, of course, but I recommend listening to the whole thing. If you want to skip ahead to the interview, though, it starts at the 20:44 mark, the part about K365 starts at 53:50, and the part about DattoCon starts at 1:07:50.
And now on to this week’s post.
AI infrastructure is unexciting and unignorable
The more I poke around in AI, the more I find myself wondering if we’re devoting too much attention to the wrong opportunities.
Well, not wrong exactly. I mean, NVIDIA’s revenue soared 262% year over year in its latest quarter, to $26 billion. That and the 33% revenue growth for AI chips this year that Gartner is projecting are both well worth some attention.
So is the more than $52 billion businesses worldwide will spend annually on generative AI models and software by 2028, according to S&P Global Market Intelligence, and the $800 to $900 billion headed for AI startups by 2027, according to S&P Global Ratings.
But maybe all the excitement about AI processors, platforms, and software makers has blinded us to other less sexy but still lucrative opportunities. I’ve written recently about the money to be made securing AI data, for example, and 34% of corporate IT execs surveyed by IDC this year cited cyber-resilience/security as their top AI spending priority.
But that was actually less than the 35% who named infrastructure instead. Indeed, Dell’s Infrastructure Solutions Group recorded 22% annualized growth in its latest fiscal quarter, fueled by “demand strength across AI and traditional servers,” while HPE reported $4.6 billion of “cumulative AI systems orders” in its most recent financials.
Power management vendor Eaton is starting to see AI-related momentum too. We all know that AI solutions consume immense quantities of power. “That creates opportunity for our solutions with our partners” involving rack enclosures, power distribution units, and uninterruptible power supply hardware, notes Steve Loeb, the company’s vice president of distributed infrastructure sales.
“The chips are interesting, yes,” he says. “The servers are interesting, yes. The underlying technologies are very intriguing, but really nothing happens until the electrons move.”
Meanwhile, 24% of respondents to that IDC survey I mentioned before cited as their top AI investment priority another form of infrastructure spending all too easily dismissed as a commodity: storage and data management.
Which makes sense, if you think about it, for a number of reasons. Data privacy concerns have a lot of businesses using or planning to use LLMs of their own, rather than centralized offerings from OpenAI, Anthropic, and the like. That involves collecting a whole lot of training data and feeding it, preferably at high speed, to servers full of pricey NVIDIA processors that only pay off if they’re kept busy.
“You need to saturate those GPUs,” says Simon Robinson (pictured), a principal analyst and storage expert at Enterprise Strategy Group. “Data can be the bottleneck there, so in those environments the biggest challenge is can I get storage that’s fast enough?”
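To make the “fast enough” question concrete, here’s a rough back-of-the-envelope sketch in Python. Every number in it is an assumption I’ve made for illustration, not a spec from Robinson or anyone’s datasheet; the point is simply that the read bandwidth your storage has to sustain scales with the number of GPUs you’re trying to keep busy.

```python
import math

# Rough, illustrative estimate of the storage throughput needed to keep a
# training cluster's GPUs fed with data. Every number below is an assumption
# made for this sketch, not a figure from the article or any vendor.

NUM_GPUS = 64                  # assumed size of a modest training cluster
INGEST_PER_GPU_GBPS = 2.0      # assumed sustained read rate per GPU (GB/s)
DRIVE_READ_GBPS = 3.0          # assumed sustained read rate per NVMe drive (GB/s)

required_gbps = NUM_GPUS * INGEST_PER_GPU_GBPS
drives_needed = math.ceil(required_gbps / DRIVE_READ_GBPS)

print(f"Aggregate read bandwidth to saturate {NUM_GPUS} GPUs: {required_gbps:.0f} GB/s")
print(f"NVMe drives needed on throughput alone: {drives_needed}")
```

Even with these made-up inputs, the storage tier has to deliver well over a hundred gigabytes per second, which is why “fast enough” is rarely a given.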
That’s just the training data too. Once a model’s trained, there’s also inference data to store—and archive, in case eight newspapers, say, sue you for stealing their intellectual property and you have to prove them wrong.
“All of that inference data needs to be maintained forever,” said Marty Falaro, chief revenue officer of online storage vendor Wasabi Technologies, during a recent episode of MSP Chat. “We believe that represents a massive opportunity for more cloud storage.”
Granted, the businesses most in need of storage space for training and inference data will tend to be on the bigger side. But most SMBs are going to need added storage capacity thanks to generative AI too, according to Greg Schulz, founder and senior analyst at consultancy StorageIO. How much more will vary a lot based on what your genAI is generating. If it’s mostly text, the extra capacity required might be “in the low single digits” as a percentage of what you’re consuming now, Schulz says.
“If you’re doing something where the content that you’re having it produce is much larger, you could be high teens, 20%.” You can forecast what to expect by thinking back to what happened years ago when people began storing audio, photographs, and video for the first time, Schulz notes.
“If your organization saw a 10% increase, you’re probably going to see something similar with generative,” he says. “If you saw a 25%, 50% jump when you started adding video and really high-resolution graphics and stuff like that, you should probably use that as your number and maybe even pad it a bit.”
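Schulz’s rule of thumb is simple enough to capture in a few lines of code. The sketch below is my own framing of it, with hypothetical inputs: plug in your current capacity, the percentage bump you saw when multimedia first arrived, and however much padding makes you comfortable.

```python
def estimate_genai_capacity(current_tb: float,
                            multimedia_era_growth_pct: float,
                            padding_pct: float = 10.0) -> float:
    """Rough capacity forecast based on Schulz's rule of thumb: expect a
    generative AI bump similar to the one you saw when audio, photos, and
    video first showed up, padded a bit to be safe. All inputs here are
    hypothetical examples."""
    growth = (multimedia_era_growth_pct + padding_pct) / 100.0
    return current_tb * growth

# Example: a client storing 200 TB today that saw a 25% jump in the
# multimedia era might plan for roughly this much additional capacity.
extra_tb = estimate_genai_capacity(current_tb=200, multimedia_era_growth_pct=25)
print(f"Plan for about {extra_tb:.0f} TB of additional capacity")
```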
That’s just the storage hardware, mind you. There’s additional revenue, and better margins, to be made in storage-adjacent services. “AI is about right data in the right place at the right time,” observes Robinson. “The storage challenges are not so much storage issues. They’re management issues.”
For example, he continues, companies that want to train private large language models generally discover that one of the biggest impediments is how fragmented their training data is. “Organizations generally don’t have a clear understanding of what they have where,” Robinson says. They’ll pay people to fix that, and to automate tasks like capacity management too.
“Storage infrastructure is still quite manual,” Robinson says. “Capacity is provisioned by people.”
Addressing issues like that mostly involves one-time project work, versus recurring revenue, but there will be many such projects for a long time to come as AI continues to evolve. “What we think of as AI today is completely different from how we were thinking about it even two years ago,” Schulz observes. “It’s not standing still.”
Going global gets a look
Covid, as we all recall more vividly than we might prefer, set off an avalanche of remote work. Since then, most employers have summoned people back to the office for part of the week, but working from home remains an entrenched part of the employment scene, as data from the long-term Survey of Working Arrangements and Attitudes posted last Friday makes clear.
By my count, almost exactly as many people work three or more days from home as spend zero days there. Supporting those people has been an MSP money maker for over four years now. Hiring them as technicians has been helpful too as an answer to persistently low IT unemployment (which spiked unexpectedly to 3.7% last month, according to CompTIA, but still trails economy-wide joblessness).
Some of those remote techs, moreover, work from homes not just up the street but in neighboring cities, counties, and states, a trend that’s had welcome psychological effects, according to Sunny Kaila (pictured), CEO of IT By Design, the “talent solutions” specialist for MSPs I discussed in a post last October.
“A lot of people have had a lot more courage to try that out, and they gained confidence that it can work,” he says.
And work over longer distances than across state lines, Kaila adds, echoing a recent CNBC story that calls remote hiring a “gateway drug” to the next big thing in employment: “borderless” international recruitment.
Kaila has been introducing sometimes hesitant MSPs to borderless staffing for years. The lessons he’s learned along the way are the subject of his second book, due in September, The Secret to Building Winning Global Teams: How to Leverage Offshore Talent to Exponentially Increase Profitability and Valuation.
It includes ROI data on hiring overseas based on real-world case studies that Kaila let me peek at recently (minus names and other PII). They show IT By Design clients that outsourced technical work to India, the Philippines, or elsewhere lowering their labor costs anywhere from 15% to 60%, resulting in dual payoffs.
“Every dollar that you save from labor costs goes to profit, and the valuation of an MSP is normally a multiple of profit,” Kaila notes.
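Kaila’s arithmetic is easy to sketch out. Here’s a minimal illustration in Python; the labor spend and valuation multiple are hypothetical placeholders, not figures from his case studies, and the savings rate simply sits in the middle of the 15% to 60% range above.

```python
# Minimal illustration of Kaila's point that labor savings fall straight to
# profit, and that MSP valuations are typically a multiple of profit.
# All inputs are hypothetical placeholders, not IT By Design case-study data.

annual_labor_cost = 1_000_000   # assumed annual technical labor spend, in dollars
savings_rate = 0.30             # assumed rate in the middle of the 15%-60% range
valuation_multiple = 6          # assumed profit multiple for an MSP valuation

added_profit = annual_labor_cost * savings_rate
added_valuation = added_profit * valuation_multiple

print(f"Added annual profit: ${added_profit:,.0f}")
print(f"Added valuation at a {valuation_multiple}x multiple: ${added_valuation:,.0f}")
```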
MSPs long had legitimate reasons to avoid global outsourcing despite those facts, he concedes. “The market wasn’t mature enough to meet U.S. service standards.” But that’s changed more recently, Kaila insists. India’s population is currently just over 1.4 billion, and 800 million of those people are under 30 years old.
“If you’re under 30, you were born with a phone in your hand,” Kaila says. “From birth, you start speaking English, you start watching Hollywood movies, you start watching all the shows online,” not to mention an endless stream of English-language TikTok videos. As a result, companies like IT By Design now have access to an almost limitless supply of trained IT administrators in India who fluently speak the same language as SMBs here in America and share similar cultural reference points.
The upshot of that, Kaila believes, could be financially transformational for a generation of MSPs open minded about going borderless. I look forward to hearing him make that case at length a couple of months from now when the new book debuts at IT By Design’s Build IT Live conference in Orlando.
Also worth noting
Logically, the mega MSP in the making we wrote about a couple of weeks ago, has a security conference coming in October.
SentinelOne is now supplying risk assessment data to cyber insurer Aon.
Egnyte has a new built-in AI buddy (I’m tired of saying “copilot”) that chats, summarizes content, transcribes recordings, and more.
Speaking of AI buddies, Enso has shipped a collection of “Guided AI Agents” for SMBs.
We told you it’s got a shot! N-able has signed the CISA Secure By Design pledge.
Funding deals in cybersecurity rose 1% by volume and 71% by value in Q2, according to Pinpoint Search Group.
Just over three-fourths of MSPs saw an attack on their infrastructure in the last year, and just over half of them paid unplanned remediation bills as a result, according to Netwrix.
Eric Schott, formerly senior director of product management at Veeam, is now chief product officer at Object First, a maker of immutable storage appliances for Veeam backups.
Ronnie Tuttle is the new channel sales director at Trustifi.
Guardz, an all-in-one security vendor for SMBs, now integrates with ConnectWise PSA.
Digital experience management vendor ControlUp has joined Ingram Micro’s line card.
IGEL has added new support for Windows 365 Frontline edition, Microsoft Intune, and Microsoft Azure Stack HCI.
Badge and Cisco Duo are collaborating on MFA.