Innodata Inc. (INOD) on Q4 2022 Results - Earnings Call Transcript

Operator: Greetings. Welcome to Innodata’s Fourth Quarter and Fiscal Year 2022 Earnings Call. Please note this conference is being recorded. I will now turn the conference over to your host, Amy Agress. You may begin. Amy Agress: Thank you, John. Good afternoon, everyone. Thank you for joining us today. Our speakers today are Jack Abuhoff, CEO of Innodata, and Marissa Espineli, Interim CFO. We will hear from Jack first, who will provide perspective about the business, and then Marissa will follow with a review of our results for the fourth quarter and the 12 months ended December 31, 2022. We will then take your questions. First, let me qualify the forward-looking statements that are made during the call. These statements are being made pursuant to the Safe Harbor provisions of Section 21E of the Securities Exchange Act of 1934, as amended, and Section 27A of the Securities Act of 1933, as amended. Forward-looking statements include, without limitation, any statements that may predict, forecast, indicate or imply future results, performance or achievements. These statements are based on management’s current expectations, assumptions and estimates and are subject to a number of risks and uncertainties, including, without limitation, the expected or potential effects of the novel coronavirus (COVID-19) pandemic and the responses of governments, the general global population, our customers and the company thereto; impacts resulting from the rapidly evolving conflict between Russia and Ukraine; investments in large language models; that contracts may be terminated by customers; that projected or committed volumes of work may not materialize; pipeline opportunities and customer discussions, which may not materialize into work or expected volumes of work; acceptance of our new capabilities; continuing Digital Data Solutions segment reliance on project-based work and the primarily at-will nature of such contracts and the ability of these customers to reduce, delay or cancel projects; the likelihood of continued development of the market, particularly new and emerging markets that our services and solutions support; continuing Digital Data Solutions segment revenue concentration in a limited number of customers; potential inability to replace projects that are completed, canceled or reduced; our dependency on content providers in our Agility segment; a continued downturn in or depressed market conditions, whether as a result of the COVID-19 pandemic or otherwise; changes in external market factors; the ability and willingness of our customers and prospective customers to execute business plans that could give rise to requirements for our services and solutions; difficulty in integrating and driving synergies from acquisitions, joint ventures and strategic investments; potential undiscovered liabilities of companies and businesses that we may acquire; potential impairments of the carrying value of goodwill and other acquired intangible assets of companies and businesses that we may acquire; changes in our business or growth strategy; the emergence of new or growth in existing competitors; our use of and reliance on information technology systems, including potential security breaches, cyber attacks, privacy breaches or data breaches that result in the unauthorized disclosure of consumer, customer, employee or company information, or service interruptions; and various other competitive and technological factors and other risks and uncertainties indicated from time to time in our filings with the Securities and 
Exchange Commission, including our most recent reports on Form 10-K, 10-Q and 8-K and any amendments thereto. We undertake no obligation to update forward-looking information or to announce revisions to any forward-looking statements except as required by the federal securities laws, and actual results could differ materially from our current expectations. Thank you. I will now turn the call over to Jack. Jack Abuhoff: Thank you, Amy. Good afternoon, everybody. Thank you for joining our call. Today, I am going to talk briefly about our Q4 and year-end results. And then I’m going to spend some time discussing the recent acceleration in AI investment by large technology companies in large language models, coinciding with OpenAI’s fourth quarter release of its large language model called ChatGPT, and how we believe Innodata is quite well positioned to capitalize on this increased investment. So first, our results. Q4 revenue was $19.4 million, a 5% increase over the prior quarter, which annualizes roughly to a 22% growth rate. We posted positive adjusted EBITDA of approximately $250,000 in the quarter, which was a positive swing of $1.5 million from Q3. This significant improvement resulted primarily from our September/October cost containment and efficiency initiatives. The benefits of these initiatives will be fully reflected in our first quarter 2023 results. We ended the year with a healthy balance sheet, no appreciable debt and $10.3 million in cash and short-term investments. In 2022, overall, we grew revenues 13% despite the significant revenue decline from our large social media customer that underwent significant internal disruption in the second half of the year, but which we believe may normalize this year. Let’s now shift to the recent substantial uptick we are seeing in our market activity. As most everyone now knows, in late Q4, OpenAI unveiled ChatGPT. This AI large language model has since gone viral, capturing the popular imagination for its ability to write, to generate computer code and to converse at what seems like human or even superhuman levels of intelligence. We believe the release of ChatGPT has been broadly seen as a watershed event potentially heralding a fundamental advancement in the way AI can drive changes in business communication, processes and productivity. Our market intelligence indicates that many large tech companies are accelerating their AI investments as they compete for domination in building and commercializing large language models and that an arms race of sorts is now forming. We believe that the significant investment that will likely result from this competition could dramatically accelerate the performance of these large language models. As a result of this dramatic increase in performance, we expect almost every industry will face fundamental reinvention. We believe that the opportunity for Innodata in all of this is significant and that it is now upon us. We believe our opportunity is actually threefold: first, to help large technology companies, both existing customers and new customers, compete in this large language model arms race; second, to help businesses incorporate large language models into their products and operations; and third, to integrate these technologies into our own platforms. Let’s take each of these in turn, starting with what I just laid out as our first opportunity, helping technology companies, both existing customers and new customers, compete in the large language model arms race. 
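The annualized growth figures quoted in these remarks (a 5% sequential increase annualizing to roughly 22%, and, later, Agility’s 6% sequential increase annualizing to roughly 26%) appear to follow simply from compounding the quarterly rate over four quarters:

\[
(1 + 0.05)^4 - 1 \approx 0.216 \approx 22\%, \qquad (1 + 0.06)^4 - 1 \approx 0.262 \approx 26\%.
\]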
While ChatGPT and a host of lesser-known but equally impressive large language models are for sure amazing, our view is that it’s still early days. We believe these large language models have room for significant improvements in output quality, in the languages they serve, in the domains they support and in terms of safety. These are all challenges that we believe we can help with. We expect to help by collecting large-scale, real-world data for training; by creating high-quality synthetic data when real-world training data is hard to come by; by annotating training data; and by providing reinforcement learning from human feedback, or RLHF, to fine-tune model performance and eliminate hallucinations, which is the tendency of these models to make things up on the fly. In addition, we expect to help by minimizing the risk that models generate unsafe or biased results, and we expect to help by hyper-training generalized models for specialized domains. High-quality data is at the root of addressing all of these challenges, and this is and has been Innodata’s bread and butter specialty for 30 years. We believe that the arms race to which I’m referring has likely already begun. In just the past few weeks, it seems, activity for us has dramatically surged. We are now either expanding work with, beginning work with or discussing working with 4 of the 5 largest technology companies in the world. I am going to share some examples of the surge in activity we’ve seen in just the past few weeks. A major cloud provider, whose AI needs we began serving 24 months ago, engaged us to help them build a new large-scale generative AI model for images. We started the initial phase of this just this week. In addition, the customer asked us just last week to kick off a pilot to support their generative AI large language model development. We started the pilot this week. With the same customer, also in the last few weeks, we expanded our synthetic data program to support its large language model development. We believe high-quality synthetic data is likely to be a key ingredient of high-performing large language models of the future. Synthetic data is entirely new data that we generate through a machine-assisted process to match real-world data and maintain all of the statistical properties of real-world data, which is especially useful for capturing rare cohorts and outliers of interest. Synthetic data is also helpful to correct for data bias, to improve algorithmic fairness and to avoid having to train on proprietary or confidential data. We started working with synthetic data back in 2022 and we have been continually improving our capabilities and technologies for synthetic data creation since then. With this customer, we’ve gone from serving just one of their product lines to now being firmly engaged with three product lines and we are in pilots with three additional product lines. Also in the past couple of weeks, with another of the world’s largest tech companies, this one a company that would be a new customer for us, we’ve gotten a verbal commitment to assist them on projects relating to large language models. They have told us that they are in the final stages of putting in place a statement of work. 
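To make the synthetic-data concept described above concrete, here is a minimal, purely illustrative sketch of one common approach: fit the statistical properties of real records and then sample entirely new records that match them. The columns, the numbers and the Gaussian model are assumptions for this example, not a description of Innodata’s actual process.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Toy stand-in for "real-world" tabular data: 1,000 records, 3 numeric fields.
real = rng.multivariate_normal(
    mean=[50.0, 3.2, 120.0],
    cov=[[25.0, 1.0, 10.0],
         [1.0, 0.4, 0.5],
         [10.0, 0.5, 90.0]],
    size=1_000,
)

# Fit the statistical properties of the real data (means and covariances)...
mu = real.mean(axis=0)
sigma = np.cov(real, rowvar=False)

# ...then generate entirely new, synthetic records that preserve those statistics.
# In practice, rare cohorts or outliers of interest could be deliberately oversampled.
synthetic = rng.multivariate_normal(mean=mu, cov=sigma, size=1_000)

print("real means     :", np.round(real.mean(axis=0), 2))
print("synthetic means:", np.round(synthetic.mean(axis=0), 2))
print("real variances     :", np.round(sigma.diagonal(), 2))
print("synthetic variances:", np.round(np.cov(synthetic, rowvar=False).diagonal(), 2))
```

Production synthetic-data pipelines would use far richer generative models than this; the point is only that the synthetic records are new data whose statistics mirror the real data without reproducing any actual real-world record.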
While there can be no assurance that the SOW is put in place, based on our current estimations and assumptions, the value of this program could potentially approach a run rate of approximately $1.8 million per year in its initial phases and could ramp up to approximately $6 million per year as it gains momentum. In addition, 2 weeks ago, one of the world’s largest social media companies, another potential new customer for us, reached out to discuss how we might potentially support its large-scale model development. It was referred to us by one of our existing customers, who apparently said that we could be helpful in unlocking value, unlocking scale and bringing a consultative approach to a partnership. We believe that the opportunities I’ve just mentioned individually and in the aggregate are potentially very large. I want to underscore that several of these are pipeline opportunities at various stages of the pipeline, from early stage to late stage. Pipeline opportunities are inherently difficult to forecast and often do not close. That said, I’ve offered them here in support of two beliefs: the first belief, that there is building momentum among big tech companies for AI innovation generally and large language models specifically; and the second belief, that Innodata’s reputation for high-quality work with high-quality outcomes is becoming firmly instantiated in a dynamic market that is viewing us as a potential partner in one of our generation’s greatest innovations. Now, let’s shift to our second significant market opportunity. We believe that our second significant market opportunity is to help businesses harness the power of these foundational generative AI models. Most enterprises have tasks that generative AI can make easier. As the technology improves, and we expect it will, we believe that businesses will see incorporating the technology as a must-have rather than a nice-to-have. Analysts are predicting that this year, the most forward-thinking business leaders will be actively putting time and money into reimagining their products, service delivery and operations based on what AI can do for them, leading to widespread deployments over 2024 and 2025. What we are also hearing, especially from CTOs, is that their biggest roadblock to deploying AI is finding the right engineers and data scientists to help them get there. We believe our opportunity will be to do just that: to help them get there. We anticipate that this will take the form of fine-tuning existing pre-trained large language models on specific tasks within specific domains, bringing expertise in prompt engineering, the art of prompting large language models to produce the appropriate results, and helping with large language model application integration. Early in the first quarter of 2023, a large financial technology company expanded scope with us to leverage our proprietary AI models more fully and reengineer their technology for the cloud to drive operational efficiencies. Our proprietary AI engine, Goldengate, uses the same underlying encoder-decoder transformer neural network architecture as GPT. While GPT is trained broadly, Goldengate is trained narrowly on specific tasks and domains. We have experimented with coupling GPT and Goldengate, and this seems to result in even higher orders of performance. This is the third scope expansion we’ve had with this company over the course of the past 6 months, again, providing further validation of our land-and-expand strategy. 
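As a concrete illustration of the prompt engineering just described, the hypothetical sketch below fills a domain-specific template and sends it to a hosted large language model through OpenAI’s chat completions endpoint. The template, field names, model choice and parameters are invented for illustration; this is not a description of Innodata’s actual implementation.

```python
import os
import requests

# Hypothetical domain-specific prompt template; in a real product, much of the
# value lives in the domain knowledge captured in templates like this one.
PROMPT_TEMPLATE = (
    "You are drafting a press release for the {industry} industry. "
    "Write a concise first draft announcing: {announcement}. "
    "Include a headline, a dateline of {city}, and one quote attributed to {spokesperson}."
)

def draft_press_release(industry: str, announcement: str, city: str, spokesperson: str) -> str:
    """Fill the template and request a first draft from a hosted LLM."""
    prompt = PROMPT_TEMPLATE.format(
        industry=industry,
        announcement=announcement,
        city=city,
        spokesperson=spokesperson,
    )
    response = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-3.5-turbo",
            "messages": [{"role": "user", "content": prompt}],
            "temperature": 0.7,
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(draft_press_release(
        industry="medical technology",
        announcement="a new AI-assisted claims processing product",
        city="New York, N.Y.",
        spokesperson="the company's chief executive",
    ))
```

In a product such as the PR CoPilot module introduced next, the differentiation would come from the curated prompts, the data merged into them and the surrounding workflow rather than from the API call itself.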
We believe our third opportunity is to harness GPT and other large language models in our own AI industry platforms. Just last month, we announced PR CoPilot, a new module within our Agility PR platform that combines proprietary Innodata technology and GPT to enable communications professionals to generate first drafts of press releases and media outreach in record time. With our release of PR CoPilot, we became, we believe, the PR industry’s first integrated platform to incorporate large language model technology. The implementation was significant for Innodata, and we received a supportive write-up in PR Weekly for it. The start-up named Jasper vaulted to unicorn status when it implemented something very similar to PR CoPilot for creating blogs and social media postings. Their efforts got them a $125 million Series A round on a healthy $1.5 billion valuation. With respect to our Agility platform, we are seeing positive momentum in key performance indicators, which we think PR CoPilot and our newly integrated social media listening product will help to further accelerate. In Q4, Agility platform sales grew 6% over Q3, which annualizes to a roughly 26% growth rate. In 2022 overall, our direct sales new logo bookings increased by 83% year-over-year and our direct sales net retention increased to 100%. In 2022 overall, approximately 83% of our Agility revenue came from direct sales, and 17% of our revenue came from channel partners. In Q4, our conversion from demo to win in direct sales increased to 33%, up from approximately 18% at the beginning of the year. We believe the notion that customers who use us love us is also very much apparent in our Synodex platform. Synodex grew by 71% in 2022 with a net retention of 168%. We announced in Q4 that one of our large Synodex customers had expanded its recurring revenue program with us. In the announcement, we stated that the expansion was valued at approximately $600,000, but we now believe the value of the expansion is actually closer to $1.2 million. This is now our second-largest Synodex customer with an estimated annual recurring revenue base of $2.3 million. This year, we will be focused on product development to expand our addressable market for medical data extraction. We’ve got new products currently being evaluated by charter customers in disability claims processing, personal injury claims processing and long-term care claims processing as well as in clinical medical data annotation and fully automated life underwriting. Integrated AI will be a feature in all of these products. We are more enthusiastic than ever about our market opportunity and the intrinsic value of our business. In our last call, we said we anticipated expanding our adjusted EBITDA to $10 million or more in 2023 and, at the same time, capturing significant growth opportunities. We believe the activity we are now seeing in our markets will likely enable us to achieve this and potentially more. I will now turn the call over to Marissa to go over the numbers, and then we’ll open the line for some questions. Marissa Espineli: Thank you, Jack. Good afternoon, everyone. Let me recap the fourth quarter and fiscal year 2022 financial results. Our revenue for the quarter ended December 31, 2022, was $19.4 million compared to revenue of $19.3 million in the same period last year. 
Our net loss for the quarter ended December 31, 2022, was $2 million or $0.07 per basic and diluted share compared to a net loss of $1.2 million or $0.04 per basic and diluted share in the same period last year. The total revenue for the year ended December 31, 2022, was $79 million, up 13% from revenue of $69.8 million in 2021. Net loss for the year ended December 31, 2022, was $12 million or $0.44 per basic and diluted share compared to a net loss of $1.7 million or $0.06 per basic and diluted share in 2021. Adjusted EBITDA was $0.2 million in the fourth quarter of 2022 compared to adjusted EBITDA of $0.3 million in the same period last year. Adjusted EBITDA loss was $3.3 million for the year ended December 31, 2022, compared to adjusted EBITDA of $3 million in 2021. Our cash and cash equivalents and short-term investments were $10.3 million at December 31, 2022, consisting of cash and cash equivalents of $9.8 million and short-term investments of $0.5 million, and $18.9 million at December 31, 2021, consisting of cash and cash equivalents. So thanks, everyone. John, we are now ready for questions. Operator: Thank you. The first question comes from Tim Clarkson with Van Clemens. Tim, please proceed. Tim Clarkson: Hi, Jack. Apologies if you hear noise in the background; we had a major snowstorm in Minneapolis and the snow is coming off our buildings, so anyhow. The first question I have is, what exactly is the technology experience that makes Innodata successful at moving artificial intelligence forward in these chatbots? Jack Abuhoff: Sure, Tim. Well, it’s not limited to chatbots. Artificial intelligence, people believe and I firmly believe, is at a kind of a fundamental inflection point. We’re now seeing the kinds of technologies that people have dreamt about probably since the 1950s. And when you think about building these technologies and think about what goes into them, it’s not programming in the traditional sense, it’s data. It’s high-quality data and data that can help to address some of the fundamental problems that these technologies have. They need to improve their output quality. They need to improve the languages they are supporting. They need to be customized for particular domains. They need to improve what we think of as safety, the kinds of responses, the kinds of things that they tell us. So what needs to be done for that? The things that need to be done are the things fundamentally that we’ve done for a very long time very, very successfully for some of the largest companies out there. When we were being retained for the large engagements we’ve had in the past, by companies like Apple, what we were doing for them was fundamentally building large-scale, high-quality data, but for products and for publishing. Here, we’re building it because data is the programming language of AI and the programming language of large language models. Tim Clarkson: Right, right. And what typically are the gross margins on the revenues you’re getting from this? Are these high gross margin products? Jack Abuhoff: I think from a gross margin perspective, I would continue to expect a range of gross margins from our different capabilities. In the services sector, I think mid-30s to mid-40s gross margins are achievable, and they’ll get better over time. As we introduce automations and technology, they tend to drift higher. When we’re starting up new projects, they tend to drift a little bit lower. On the platform side, they are higher than that. 
Incremental gross margins especially can be very substantial. And as we build scale and start to scale on our fixed costs, we can start to see the kinds of gross margins that will emerge from those kinds of business models. Tim Clarkson: Right, right. Well, just one last comment. I’ve been all excited about Innodata lately. And the analogy I use is it’s what’s happening with artificial intelligence is before it was sort of like da Vinci seeing a picture of an airplane. But it’s one thing to see a picture of an airplane, it’s another one to see one fly by you and go from Minneapolis to New York. And when you actually see this artificial intelligence stuff work, you no longer have to be sold on the value. I mean it’s magical. So that’s, for me, that’s the difference as people are really excited about the end product, and emotion is what drives ultimately decision making, and there is real excitement behind this. So with that, I’m done. Jack Abuhoff: Thank you, Tim. Operator: The next question comes from Dana Buska with Feltl and Company. Please proceed. Dana Buska: Hi, Jack. How are you today? Jack Abuhoff: Dana, I am doing great. Thank you, welcome to the call. Dana Buska: You are welcome. Yes, thank you for taking my questions. My first question is with your new PR CoPilot. Is – you talked about your own technology, and I was just wondering if your technology is more than just accessing an API at OpenAI? Jack Abuhoff: So it is. I mean it’s – basically, what we’re doing – and we’ve got a very exciting road map. We really believe we’ve only just got started with ChatGPT. But what we’re doing is we’re combining an API with prompt engineering that we’ve done behind our UI and behind the scenes. And in the future, what we’re going to be doing is we’re going to be enriching the training data to more specifically perform within the domain. We’ve also done other things in terms of that, and we will be doing other things in order to further enrich the experience. So when you take the two fundamental use cases we’re addressing, one for writing press releases and one for writing media outreach, we’re looking at what goes on in someone’s head when they are looking to do either one of those things. What connections are they making? Where are they having to go and research? And built into our product and built into our underlying data model are lots of connections that we’re able to harvest and bring them into the prompts in order to create a more precise level of output. So it’s very much a combination of these things. Now one of the reasons – so I’m excited for several reasons, but one of the things that it’s enabling us to do is to create a lot of value for PR but, at the same time, learn a lot about how do you integrate these technologies to create a superior customer experience. And we’re able to bring that experience in turn to the work that we’re doing for other customers. So it’s great fun and it really is a new frontier. Dana Buska: That sounds wonderful. I was wondering if you were going to be able to apply your CoPilot technology to other industries or other companies or other applications? Jack Abuhoff: So we’re certainly talking to other companies about this. We’re looking at some opportunities to apply large language models within our Synodex platform. That’s kind of early days though, so I wouldn’t – I’d encourage you not to expect anything out of that very quickly. We think the road map, though, for PR CoPilot is fairly extensive, and we’ve really only begun down that path. 
So we’re very excited about that. Dana Buska: Excellent. With Agility, where are you projecting breakeven for that is going to be? Jack Abuhoff: So I don’t think we’ve put out a number relative to that. But what we’ve said is that we think we are going to be getting that business to become an adjusted EBITDA positive business in the first half of the year. Dana Buska: Okay. Excellent. And in the past, you’ve given out projections for the first quarter. And I don’t know if I missed it or not, but do you have any type of projection in revenue growth that you’re thinking for the first quarter? Jack Abuhoff: So we’ve decided not to – this year, for the most part, we’re probably not going to provide forward-looking guidance only because of the level of activity that’s going on in the business. It’s so substantial and coming on so strong that being able to reduce these things to forecasts and know when something is going to close and know what it’s going to look like and how it’s going to ramp up is almost impossible. So the likelihood that we would be wrong and maybe even significantly wrong is pretty substantial. Therefore, we’ve gone the other direction, which is kind of – which I think is materially disclosing here’s what’s going on in the business. Here’s what’s coming our way. We are now working with four of the five largest technology companies who are fundamentally driving what will probably be the innovation of our lifetimes. We will no doubt look back on it as that. And we’re partnering with them. We’re involved, and that opportunity is coming our way. So long story short, we’re not – we’re going to stay out of the guidance business right now, but we’re going to try to disclose what we’re doing and the level of activity that we’re seeing. Dana Buska: Okay. Can we anticipate it will be a growth year? Jack Abuhoff: Yes, we’re very much focused on growth. So yes, it’s the easy answer to that. Dana Buska: Okay. Excellent. Alright, thank you. Operator: The next question comes from Marco Petroni with MG Capital Management. Please proceed. Marco Petroni: Hi, Jack. How are you? Jack Abuhoff: Marco, hi. How are you? Marco Petroni: Good. A couple of questions. One, everybody has AI and machine learning algorithms, and they put them to different uses. But is there any company out there that you know of that combines that with the ability that you have on the data side with regards to organizing, collecting and overlaying synthetic data on top of that? Is there anybody out there, including the big guys, that can do that? Jack Abuhoff: Well, I think there are. I think there are a couple of companies that are doing some things that are similar to us, though not very many. We’ve kind of got a view of the world that we can do three things well. And we think that there is like a virtuous circle that forms when we do the three things well. First is AI data preparation. We’re helping large companies accelerate their ability to innovate in AI by doing the things that we do on the data side. Second thing is we’re then helping deploy those models and integrate them into people’s businesses. So we’re helping build the models and then we’re helping integrate the models. And then thirdly, we have our own platforms, and we’re learning the hard way. We’re eating our own dog food, and we’re figuring out how to do it for ourselves first so we then develop the expertise to bring to both the data collection. Well, how do you collect data in a way that results in high-performing models? 
And then on the model deployment, how do you best deploy models in legacy workflows and legacy systems? How – what are the opportunities for reinvention that you can bring to bear? Marco Petroni: ChatGPT, it’s great to go in, but I’ve had experiences where I put in the same data and it gave different answers. Obviously, that can’t be used within a company. Are you guys – do you guys have the same capabilities as OpenAI in terms of creating those types of ChatGPTs within an organization, specifically for an organization, so, for example, their call centers, or internally to be able to use that to interact amongst employees as well as customers? Jack Abuhoff: Yes. So the essential architecture behind GPT is an architecture that is also behind our proprietary Goldengate technology. Now do we have the same ability to stand up something that performs the way that, that one does in a generalized way? Absolutely not. We don’t have the budgets. Hundreds of millions of dollars was likely spent on getting it trained to the level that it’s been trained on. There is a tremendous amount of data that gets poured in to create the billions and hundreds of billions of parameters that drive the model. A tremendous amount of cloud processing went into that. We cannot do that. But what we can do and what’s the future of the way these things will work is we can build on those. We can train them. We can customize them. We can use what’s called reinforcement learning from human feedback in order to train customized – domain-customized models with private side data to enable them to perform better. That’s really going to be the future of this. So we will see the big companies with large language models proprietary to them; we will help them build those, but we won’t be able to fund those ourselves. But then what we will be able to do is customize them and build upon them in order to create business outcomes for people. Marco Petroni: And one – just one last question, you guys are trading at $200 million, roughly 2x revenue. What is the company going to do going forward now that – I mean, obviously, AI is everywhere. What is the company going to do to market our stock basically and get out there? I know earnings and revenue are great. But to get out there and do that, what do you have planned in the next coming months and quarters? Jack Abuhoff: So I think the most important thing we can do for our shareholders and, of course, I’m prominent among our shareholders, is to continue to do the things that we’re doing. The fact that we’re in four out of five of the largest technology companies helping to develop what will be a transformative technology that’s still in its infancy and will need a lot of work over the next several years, that we’ve gotten there and we’re doing that, I think, is huge. That we’ve gotten there and are doing that without having to go out into the markets and dilute our equity and raise a ton of debt is, I think, an impressive feat. Now how do we better promote that? I think the first thing starts with execution, and then what follows from that is lots of conversations with people who I’m hoping will be attracted to the execution that we’re bringing to bear. And of course, looking at some of the techniques that people use, conferences and outreach and talking to analysts and all of those things, but fundamentally, we’re going to be about execution. Marco Petroni: No, absolutely. But I mean I’ve been a shareholder for 2 years, and nobody really knows about us. 
Every other AI company is trading at a 5, 10, 15 multiple on revenue. We’re trading at 2x. And going forward, we have the potential of growing at 30% plus. Seems like we’re pretty undervalued here compared to the sample. But thank you. Jack Abuhoff: Thanks, Marco. Operator: The next question is – comes from Craig Samuels with Samuels Capital Management. Please proceed. Craig Samuels: Hey, Jack, how are you? Jack Abuhoff: Hey, Craig, how are you? I am well, thanks. Craig Samuels: Pretty good. Thank you. Several quarters ago, you talked about your total number of sales reps for both Agility and the services solutions side. Where do you stand today with the numbers of sales reps? Jack Abuhoff: So on the services solutions side, on the AI side, we’re about at the same number that we last shared. We’ve taken down the number quite a bit on the Agility side. And a couple of things went into that decision, first of which was in the beginning of the year, we were having a hard time retaining people very specifically in one of the sales offices that we put up. It was – there was a labor shortage. That was pretty well known. We were in Austin, Texas, where a lot of SaaS companies were, and they were overpaying, as far as I’m concerned, for talent. And we didn’t want to play that game. So what we decided and said was, let’s be good stewards of capital. Let’s not play the game of overpaying for talent. Let’s instead work with a smaller number, and we’ve had great success retaining very talented salespeople in our other offices. Let’s retain them. Let’s work on it. Let’s build a sales organization that has very much a data-driven approach to sales and to optimizing our customer experience and build from there. And I think that’s proving to be the right decision. As we look out at Agility, and we kind of see what’s going on now, we see new logo bookings up 83% year-over-year, our net retention going from the 90s up to 100% now. Significant performance improvement relative to the number of demos that we do that end up in closed sales, going from like 18% at the beginning of the year to 33% now. So with that, we will, I believe, see acceleration in growth. And it’s always easier to throw logs on a fire that’s burning strongly. So that’s kind of where we are. Craig Samuels: Right. So I don’t remember the last numbers that you had. Can you share them for services? Going back in time, I seem to remember, it was like 6 or 7. And on the Agility side, you had a target of 110, and the last numbers I have are in the 67 zone. It’s been a little while. Can you actually share the numbers? Jack Abuhoff: Sure. I think what we said was we had 9 folks, quota-bearing executives, in the services solutions area. And then we had a combination of about 90 people, 42 quota-bearing people in Agility and another 37 BDRs. We’ve – in Agility, we’ve reshuffled those numbers pretty considerably in terms of the mix and the workflows, and we brought that number down quite a bit. I don’t have the current number to share with you. Happy to do that off the call when I go get that, but we brought that down, and we’re getting the performance off of that smaller cohort which is, at the end of the day, what it’s all about. Craig Samuels: Right. And then on the 9 service sales reps, I recall from, again, this is probably 2 years ago, the – Jack Abuhoff: Craig, sorry, I think you dropped off. Operator: One moment, I’ll reconnect Craig. Jack Abuhoff: Okay. Operator: Craig, can you hear us? Craig Samuels: I can hear you. Are you there? 
Operator: Okay, your line is live. Craig Samuels: Not sure what happened, but... Jack Abuhoff: Yes, Craig, I’m not sure what’s happening. Craig Samuels: I had asked about the productivity, and I seem to remember about $1.5 million of quota per service AI sales rep. Is that still consistent with where you guys are today? Or has that number gone up or down? Jack Abuhoff: Yes. So I think we put that out there where they average the – in the services and solutions area, the quotas are actually derived from the account assignments. So depending upon the accounts that people are working on there, they can be fairly significantly different from that. They can be much higher than that. And an entry-level person who’s kind of building his account base can be lower than that. Craig Samuels: Got it. And then also Nvidia had some news regarding the data center providing compute power for AI, and just wondering if that helps you or if that’s competitive? Jack Abuhoff: No, I think it’s very much supportive of the value proposition. I was on their call and I remember what was said on the call. And I think the way they are viewing the opportunity is very much the way we’re viewing the opportunity. They are looking at it from a different perspective. They are looking at it from a perspective of enabling it from a processor side. We’re looking at it from a perspective of enabling it with data. And these are two sides of the same coin as far as I’m concerned. Craig Samuels: Yes, that’s exactly what I thought. Just wanted to hear you confirm that. And then lastly, would you expect your gross margins to increase over the next 12 to 36 months? Will there be a greater software component? Or will it still be heavily weighted towards services? Jack Abuhoff: I think it depends on kind of what happens. Given the activity that we’re now seeing of great significance, I think we will continue – if we’re successful at closing the opportunities that are before us, I think we’re going to continue to see a very heavy weighting towards solutions and services from a consolidated margin perspective. I’m going to live with that problem, though. Craig Samuels: Right. So that means still below 40%? Jack Abuhoff: Not necessarily. I think that as we start to execute the plan, we will be able to move above 40% over time, but probably still below 50%. Again, it remains to be seen what we’re able to deliver on the platform side. But if the opportunity is as large as we’re hoping it is on the solutions side, I think it will weigh towards that. Craig Samuels: Yes. Sounds good. Keep up the good work, and look forward to continuing to monitor your progress over time. Thanks. Jack Abuhoff: Thank you. Operator: We have reached the end of the question-and-answer session, and I will now turn the call over to Jack for closing remarks. Jack Abuhoff: Thank you, operator. So yes, I’ll quickly recap. We’re seeing very recent acceleration in AI investment by large tech companies. It seems to be coinciding with OpenAI’s release of ChatGPT. We’re now either expanding work with or beginning work with or discussing starting work with four of the five largest tech companies in the world. And much of what is under discussion has to do with building and improving large language models. I’m very excited about where we are with these companies and excited about where we are with a host of similarly impressive companies across other domains. 
Even though forecasting exact close dates remains challenging, we think we’re in the right place at the right time to ride this wave. We’re seeing positive trends across our other business segments as well. Synodex growth last year was huge. It’s well positioned to expand in this market this year. And in Agility, supported by what we believe was a very successful release of PR CoPilot, we continue to make great strides in win rate, net retention and bookings, all of which are, of course, leading indicators of accelerating growth. So, very excited to be here today, very excited with the news that we’re sharing today. And thank you all for participating in this call. We will look forward to our next call with you. Operator: This concludes today’s conference, and you may disconnect your lines at this time. Thank you for your participation.