Innodata Inc. (INOD) on Q4 2023 Results - Earnings Call Transcript

Operator: Greetings. Welcome to Innodata’s Fourth Quarter and Fiscal Year 2023 Earnings Call. At this time, all participants are in a listen-only mode. A question-and-answer session will follow the formal presentation. [Operator Instructions] Please note, this conference is being recorded. I will now turn the conference over to your host, Amy Agress. You may begin. Amy Agress: Thank you, John. Good afternoon, everyone. Thank you for joining us today. Our speakers today are Jack Abuhoff, CEO of Innodata; and Marissa Espineli, Interim CFO. We’ll hear from Jack first, who will provide perspective about the business, and then Marissa will follow with a review of our results for the fourth quarter and the 12 months ended December 31, 2023. We’ll then take your questions. Before we get started, I’d like to remind everyone that during this call, we will be making forward-looking statements, which are predictions, projections or other statements about future events. These statements are based on current expectations, assumptions and estimates, and are subject to risks and uncertainties. Actual results could differ materially from those contemplated by these forward-looking statements. Factors that could cause these results to differ materially are set forth in today’s earnings press release and in the Risk Factors section of our Form 10-K, Form 10-Q and other reports and filings with the Securities and Exchange Commission. We undertake no obligation to update forward-looking information. In addition, during this call, we may discuss certain non-GAAP financial measures. In our SEC filings, which are posted on our website, you will find additional disclosures regarding these non-GAAP financial measures, including reconciliations of these measures with comparable GAAP measures. Thank you. I’ll now turn the call over to Jack. Jack Abuhoff: Good afternoon, everybody. We’re very excited to be here with you today, as we have a lot of good news to share.
We are pleased to announce fourth quarter 2023 revenues of $26.1 million, representing 35% year-over-year growth and 18% sequential growth. We exceeded our guidance of $24.5 million by 6.5% as a result of strong customer demand for generative AI services and our ability to ramp up quickly to meet customer demand. In 2023 overall, we grew revenues 10%. Now, it’s worth noting that if we back out revenue from the large social media company that went through a highly publicized take-private in 2022, in conjunction with which it terminated our services, as well as services from many of its other vendors, and laid off 80% of its staff, our Q4 2023 year-over-year revenue growth was 39% versus 35%, and our full-year year-over-year revenue growth was 23% versus 10%. This customer contributed $8.5 million in revenue in 2022 and $0.5 million in revenue in Q4 of 2022. Beginning in Q1 2024, revenue from this customer will no longer be a drag on year-over-year comparisons. We are also very pleased to announce fourth quarter adjusted EBITDA of $4.3 million, exceeding our guidance of $3.7 million by 16%. Growth in Q4 was driven primarily by the ramp of generative AI development work for one of the Big Five tech companies we signed mid-2023 and also benefited from the start of the generative AI development program with another of the Big Tech customers we announced late last summer. In late Q4, the first customer I mentioned signed a three-year deal with us for our current, initial program, with an approximate value of $23 million per year for each of 2024, 2025 and 2026, or $69 million for the three years, based on the not-to-exceed value of the statement of work. We’re very proud of this achievement. It came with customer kudos for the work we’ve done and expressions of interest in expanding the partnership further.
That said, and as a cautionary note, investors should understand that there are a number of ways under the SOW that the customer could terminate early or reduce spend if it chose to. We believe the quality of our services will always be the key to enduring customer relationships, not the stated value or term of a contract. We’re off to a strong start to 2024. We entered the year with master service agreements in place with five of the so-called Magnificent Seven technology companies. With two of these companies, we are now solidly underway. A third also contributed to Q4 growth, with a more significant ramp-up from this customer starting this month. We are optimistic we will grow revenues with all three of these customers in 2024. With the remaining two of the five Mag Seven customers, we’ve barely gotten out of the gate, but we are optimistic about making significant inroads this year. We are also in conversations with several additional companies, including some of the most prominent leaders in generative AI today. We believe we have the strategy, business momentum and customer relationships to deliver significant revenue growth in 2024. We will stick to our annual growth target of 20% in 2024, with the intention of over-achieving it. In 2024, we will target two broad markets. The first is Big Tech companies that are building generative AI foundation models and that we believe are likely to spend significantly on generative AI development. For these Big Tech companies, we provide a range of services they require to support their gen AI programs. One of these services is the creation of instruction data sets. You can think of instruction data sets as the programming used to fine-tune large language models. Fine-tuning with instruction data sets is what enables the models to understand prompts, to accept instruction, to converse, to apparently reason and to perform the myriad of incredible feats that many of us have now experienced.
We will also be providing reinforcement learning and reward modeling services, which are critical to providing the guardrails against toxic, biased and harmful responses. In addition, we are also involved in model assessment and benchmarking, helping ensure that models meet performance, risk and emerging regulatory requirements. Based on my conversations with several of these companies, as well as public remarks they have made, we believe they are likely to spend hundreds of millions of dollars each year on these services. This spend is separate from and in addition to their spend on data science and compute, the other essential ingredients of high-performing large language models. Our second target market is enterprises across a wide range of verticals that seek to integrate and fine-tune generative AI models. These are still early days in terms of enterprise adoption of generative AI, but we believe that a decade from now virtually all successful businesses will have adopted generative AI technologies into their products and operations. For enterprises, our offerings include business process management, in which we re-engineer workflows with AI and LLMs and perform the work as ongoing managed services. We also offer strategic technology consulting, where we work with customers to define roadmaps for AI and LLM integration into both operations and products and build prototypes and proofs-of-concept. We also fine-tune models, both in isolation and as part of larger systems that incorporate other technologies. For enterprises, we are capable of going soup-to-nuts, everything from initial consulting to model selection to fine-tuning, deployment and integration, as well as testing and evaluations to ensure that the LLMs are helpful, honest and harmless. Also for enterprises, we offer subscription-based platforms and industry solutions that encapsulate AI, both our own models and leading third-party models.
Much the way data is at the heart of the programming-like work we do for Big Tech, data is similarly critical to enterprise deployments. Enterprise use cases tend to be highly specific and targeted, requiring models that are trained with industry-specific or domain-specific data or that require significant prompt engineering efforts and in-context learning utilizing carefully curated and organized company data. The bottom line here is that data engineering is important both for the Big Tech companies building generative AI foundation models and for the enterprises adopting these technologies. Data engineering has been our focus for the past two decades and we believe we are quite good at it. I am going to take a few minutes now to respond to some questions I’ve been asked by investors recently. Number one, several investors have asked whether we currently anticipate needing to raise additional equity. The answer is no, we do not currently anticipate needing to raise additional equity. We ended Q4 with $13.8 million in cash and short-term investments, slightly down from $14.8 million last quarter, but that was largely due to timing, as we had $2.4 million in cash receipts from major customers collected right after the New Year and we generated over $4 million of adjusted EBITDA in Q4 alone. Nonetheless, to support our growth and future working capital requirements, we have a revolving line of credit with Wells Fargo that provides up to $10 million of financing, 100% of which was available under our borrowing base as of the end of Q4. We have not yet drawn down on the Wells Fargo line. We anticipate generating enough cash from operations in 2024 to fund our capital needs without having to draw down on the Wells Fargo facility. Number two, several investors have asked why we have no Chief Financial Officer.
Well, in a sense, we actually have four Chief Technology Officers, or at least their equivalents, each of whom manages a specific technology area: we have a PhD in computer science and AI who heads our AI labs research team and data science teams; we have an SVP of engineering overseeing product and platform engineering; we have another VP focused on software development and product evolution for our Agility product; and we have a Chief Information Security Officer who heads security and infrastructure. Under these leaders, we have close to 300 developers, architects, infrastructure managers and data scientists. We have found that this structure best supports the breadth and scale of our business. Investors have asked us to share our recent spending on software and product development, have asked why we do not separately disclose it, and have asked us to comment on whether we have a significant spend on cloud infrastructure. So, there are three separate questions there and I’ll address each. In terms of our spending across software and product development, over the last five years, we spent about $26 million. This peaked in 2022 at $8.9 million and came down to $6.4 million in 2023. However, since roughly 80% of our business is managed services, we do not view the aggregate spending across these areas as a focal point for investors. In terms of cloud, we spend a couple of million dollars per year, mostly for software, infrastructure and data hosting. It is our Big Tech customers, not us, that spend massively on GPUs for training foundation models. Other investors have asked us how they should think about our comps. Specifically, they asked whether our comps are companies like OpenAI, Google and Meta, and whether they should compare our R&D spend and cloud compute spend to these companies. These companies are absolutely not our comps. Rather, many of these companies constitute part of our target market.
We are not in their business, and to state the obvious, we are not of similar scale. Players in this market are building foundation models, and we are providing services that help them on their journey. Therefore, we do not believe that comparing our R&D spend and cloud compute spend to theirs is especially useful. We view our competition as companies focused on AI data engineering services to this market, like Scale AI and others, and companies more broadly focused on technology services but also focused on AI data engineering, like Accenture and Cognizant. Another question I’ve gotten is how we managed to pivot to AI without having to raise substantial capital. There are essentially three reasons we were able to pivot to AI without having to raise capital. The first reason, which we believe is by far the most important, is that the massive spend we read about being required to build foundation models is incurred by our large tech customers, not by us. Our customers are deploying extensive amounts of capital for cloud compute, for data science and for data engineering, three crucial ingredients of an LLM, if you will. We provide the kinds of data engineering services they need, and providing data engineering does not require that we separately incur compute costs. The second reason we were able to transition to AI data engineering without incurring massive upfront costs is that we have been a data engineering company for over 20 years and we were able to repurpose a lot of what we already had in place, including management, resources, facilities and technologies, to serve the AI use cases. The third reason is that when we began exploring AI back in 2016 and developing our Goldengate infrastructure, we incurred only manageable investment. From a data perspective, because we were already employing large teams of resources doing customer work, we did not have to incur incremental costs for humans-in-the-loop.
We simply had to re-architect our operator workbenches and create the right data lakes. The objectives we initially set for the models we built were to enable us to reduce costs associated with maintaining rules-based data processing technologies. We were not seeking to automate the work of humans, but to augment it. Over the years, Goldengate, one of our proprietary platforms, became, we believe, state-of-the-art at things like entity extraction, data categorization and document zoning, all important aspects of what we do. We use the technology in customer deployments and within our own platforms, and it yields great results. That said, Goldengate is not ChatGPT; you can’t converse with it or ask it to write poetry. Goldengate has 50 million parameters, while ChatGPT is reputed to have 1.7 trillion parameters. Nevertheless, Goldengate demonstrates that AI can be trained to perform specific tasks very well without incurring massive spending; that AI deployments leveraging open source algorithms and models can be within reach for many enterprises with industry-specific datasets; and that for business implementations especially, data engineering is more important than sheer model size as a predictor of performance. A question I got recently is, how does revenue per employee compare in your different lines of business? The answer is that revenue per employee is lowest in our managed services business, while it is multiple times higher in our AI data engineering scaled services. Regardless, we target an adjusted gross margin of 35% to 37% across these two business lines and we believe gross margin is the better metric to track. In our software business, our targeted gross margin is anticipated to be about 73% this year and we intend to target a consolidated adjusted gross margin of between 40% and 43%. The final question I’ve gotten several times recently and that I want to respond to on today’s call is, is Agility now profitable? The answer is yes.
In this quarter, Agility posted adjusted EBITDA of $1.2 million. This was a 69% sequential increase over Q3. We think we executed the Agility business very well in 2023, growing at 15% in a difficult macro environment. It had a strong adjusted gross margin of 69% over 2023 as a whole and 74% in Q4. We also love what we’ve done with the product. We believe we’ve taken a leadership position as the first end-to-end public relations and media intelligence platform to integrate generative AI. I’ll now turn the call over to Marissa to go through the numbers and then we’ll open the line for some questions. Marissa Espineli: Thank you, Jack. Good afternoon, everyone. Allow me to recap our fourth quarter and fiscal year 2023 results. Revenue for the quarter ended December 31, 2023 was $26.1 million, up 35% from revenue of $19.4 million in the same period last year. The comparative period included $0.5 million in revenue from the large social media company that underwent a significant management change in the second half of 2022, as a result of which it dramatically pulled back spending across the board. There was no revenue from this company in the three months ended December 31, 2023. Net income for the quarter ended December 31, 2023 was $1.7 million, or $0.06 per basic share and $0.05 per diluted share, compared to a net loss of $2 million, or $0.07 per basic and diluted share, in the same period last year. Total revenue for the year ended December 31, 2023 was $86.8 million, up 10% from revenue of $79 million in 2022. The comparative period included $8.5 million in revenue from the large social media company referenced above. There was no revenue from this company in 2023. Net loss for the year ended December 31, 2023 was $0.9 million, or $0.03 per basic and diluted share, compared to a net loss of $12 million, or $0.44 per basic and diluted share, in 2022.
Adjusted EBITDA was $4.3 million in the fourth quarter of 2023, compared to adjusted EBITDA of $0.2 million in the same period last year. Adjusted EBITDA was $9.9 million for the year ended December 31, 2023, compared to an adjusted EBITDA loss of $3.3 million in 2022. Cash, cash equivalents and short-term investments were $13.8 million at December 31, 2023 and $10.3 million at December 31, 2022. Now, before we turn to questions, like Jack, I have also gotten some questions from investors recently that I promised to respond to on today’s call. The first question was about why we keep cash overseas. The reason we keep cash overseas is to cover operating expenses in these locations. We do not plan to repatriate these funds, nor do we foresee the need to. Another question was about the cost-plus transfer pricing agreements with our offshore subsidiaries. Companies that have revenue in, say, North America or Europe, but have offshore delivery centers in countries like India and the Philippines, put in place what are called transfer pricing arrangements to satisfy the arm’s length transaction principle. Under a transfer pricing arrangement, a percentage of revenue is allocated to the delivery center. The percentage allocated is often determined by statute or regulation in the foreign country. We understand that the reason the foreign country does this is to make sure there are profits at the local level for it to tax. However, when the consolidated enterprise is losing money and would not otherwise have to pay taxes, it unfortunately ends up having to pay taxes offshore. Obviously, paying taxes when you are losing money is not a good thing and is referred to as tax leakage, but even in this situation, the tax we pay is insignificant versus the money we save by operating offshore. This business model is very common across many industries and not unique to Innodata.
The last question that I’ve gotten is whether there is any structural reason that Innodata would be expected to lose more money as it generates more revenue. The answer to this is absolutely not. As Innodata’s revenue increases, we expect that its adjusted EBITDA will increase at an even higher percentage. This is because there is some operating leverage in our direct costs, for things like production facilities and other fixed expenses, and significant operating leverage in our general and administrative operating costs. We saw clear evidence of this in both Q3 and Q4. In Q3, revenue grew sequentially by $2.5 million and adjusted EBITDA grew sequentially by $1.6 million. Similarly, in Q4, revenue grew sequentially by $3.9 million and adjusted EBITDA grew sequentially by $1.1 million. There will, however, be quarterly fluctuations in how much revenue falls to the EBITDA line based on how we flex our operating expenses, particularly our sales and marketing efforts, based on market dynamics. Well, I hope I was able to address some of our investor queries. Again, thanks, everyone. And I will now turn this over to John. John, we are now ready for questions. Operator: Thank you. [Operator Instructions] The first question comes from Tim Clarkson with Van Clemens. Please proceed. Tim Clarkson: Hey, Jack. How are you doing? Jack Abuhoff: Hey, Tim. Doing great. Tim Clarkson: Good. Good. Well, I thought the quarter was outstanding. So just as a question, I’m going to have you answer it, but you’re going to answer it in a more sophisticated way than I’m going to say it. But I mean, when I originally learned about Innodata being involved in AI, Rahul told me, and this is what he told me when the stock was at $1, he said, listen, the reason Innodata is going to be successful is they’re the most accurate. And at IBM, the reason we had so much trouble on 80% of our deals was inaccuracy.
And so far, you’ve gotten a number of smaller contracts and now you’ve gotten the big contracts. It’s coming true. So to me, that’s maybe a real simple insight for some people who are intimidated by all the complexity of AI. But why don’t you explain, in the simplest terms, how Innodata fits into AI? Jack Abuhoff: Sure. Well, in a number of different ways. And I don’t think your question is particularly unsophisticated; I think that exactly what you said is correct. The key to programming large language models is essentially the data engineering that goes into it, and the principle of garbage in, garbage out holds very much true. What I see that we’re doing a great job at is creating very high quality datasets that our customers are able to use and incorporate into large language models to get the performance from the models that they’re seeking. Instruction datasets are key to helping the models understand prompts, to accept instruction, to converse, to reason, all of these things. And that’s how they’re competing. They’re competing on the quality of the experience that their customers will have with the models that they’re building. So, to the extent that the data engineering that we provide to them is helping them achieve that, well, that obviously is a very, very good thing. Now, on top of data accuracy and data engineering, the things that we’ve been focused on for so long now, I think we create the appropriate customer experience that they’re looking for. They’re figuring things out. They need a company that’s highly dynamic and agile, that can stay with their engineering team and be responsive to the changing requirements that the engineering team has, and again, that’s something that’s firmly built into our culture. So, we’re very proud of the results that we’re showing. We’re very proud of the quality of the partnerships that we’re achieving.
I think, well, we announced that for one of the large deployments this quarter, we signed a three-year ongoing contract with a hoped-for value of $69 million. It’s a huge achievement, and it came with a lot of wonderful things that the customer had to say about us, about the value of the data, exactly like you just said, and about the quality of the experience that they have with us. So we think we’re doing good. We’re very well poised for an exciting year next year and we’re very excited about that. Tim Clarkson: Right. Now, looking at your projections, I mean, you said last time you expect some $30 million quarters. It looks like, based on what you did in the fourth quarter and on your growth rates, you’re approaching that sometime this year, right? Jack Abuhoff: Well, I think we’re going to stick with the guidance that we’re providing. Our intention is to surprise and delight our investors. We think we have the opportunity to do that. Tim Clarkson: Right. Jack Abuhoff: So the guidance that we’ve put out there is 20% growth, but with the intention of besting that… Tim Clarkson: Sure. Jack Abuhoff: I think we have a very good chance of being able to do that. Tim Clarkson: Right. Right. Now when I look at the P&L, I know you like to look at EBITDA. I like to look at net after-tax. It seems to me that as you approach, say, $30 million, you start to net 10% to 15% after-tax, and at $35 million, you start to approach more like 15% to 20% after-tax. Is that about right? Jack Abuhoff: Yeah. We are not going to -- there are a lot of things that go into the model. I think that we’re going to resist the temptation of kind of digging in and creating more of a model than we are. The guidance is what we’re saying. I think we intend to do better than that and perhaps… Tim Clarkson: Right. Jack Abuhoff: …significantly, and I think the business is not that difficult to model. I’d encourage you to do it.
I think we can create a lot of shareholder value this year. Tim Clarkson: Right. And obviously, as sales go up, historically with Innodata, profitability has always gone up on balance, not every quarter, but typically it goes up much faster than the revenues. Jack Abuhoff: That’s correct. And I think you see that operating leverage working very strongly in both Q3 and Q4, and that operating leverage, and the disproportionate increases in profitability relative to revenue growth, will continue to work for us, I believe, and will give us the ability to further invest in the company and stay aligned with our market and ahead of our competitors. We think we’re managing the company appropriately from that perspective. We’re very happy, as we just said, to confirm that we don’t plan on needing to raise equity. We think that that’s a very strong statement for a company that has been able to keep pace with competitors who are more significantly funded than we are and to compete aggressively with them and win deals against them. So we think we’re managing the opportunity appropriately and we think there’s a lot of good things ahead for us. Tim Clarkson: Right. A little softer question. Can you explain, not the big guys, but say a smaller application? You mentioned a drugstore where they might want to use AI as their customer service. Kind of explain what that would look like, or a retail shop where they’re using AI rather than necessarily people to get business done? Jack Abuhoff: Sure. Well, I’ll give you a fresh example, not even from the work that we’re doing today, but from the work that I’m hopeful we’ll be doing at some point in the near future.
We’re in conversations with a home furnishings manufacturer who wants to create the ability for someone to upload pictures to their website and to utilize those pictures to discover which of their furnishing products would fit best within that environment, and maybe even display what that might look like. So I think as you go from enterprise to enterprise, firstly, I think it’s almost inconceivable that there will be enterprises who won’t be affected by, and likely benefit from, these technologies if they seize them correctly. And the fact that, as we do the work that we’re doing with the foundation model builders, we’re also continuing to plant seeds in enterprise and to work soup-to-nuts with enterprises to figure out how they take advantage of these technologies and seize these opportunities is, I think, planting very strong seeds for the future. Tim Clarkson: Right. Okay. I’m done. Thanks. Operator: The next question comes from Dana Buska with Feltl. Please proceed. Dana Buska: Hi, Jack. Jack Abuhoff: Hey, Dana. Dana Buska: Congratulations on an excellent quarter. Jack Abuhoff: Well, thank you so much. We’re very happy with the quarter. We’re happy with how we’re kicking off 2024. Dana Buska: Oh! Wonderful. My first question is that I just want to ask a question about your Goldengate platform. It is my understanding that it’s built on the transformer architecture. Is that the same architecture that OpenAI uses? And I was just wondering, what does that mean for your offerings? Jack Abuhoff: Sure. So, I believe that it is the same architecture. And when we say that it is, we mean to use that as a proof point that we’re making good, solid, future-proofed engineering decisions within our engineering department. And I think that’s important, because it’s not trivial to make those decisions, and it’s not obvious when you’re making them whether you’re making the right ones.
Now, that having been said, we are not by any measure saying that we can use Goldengate as a substitute for ChatGPT. That’s far from the case. Goldengate is 50 million parameters. We believe ChatGPT is 1.7 trillion parameters. Goldengate does very specific things that are good for us and good for our customers in our business. We use it in many, many of our deployments. But you can’t ask it to write a poem about butterflies in iambic pentameter. It just doesn’t work for that. The fact is, though, that we picked the right technology. We’re using it very effectively in much of what we’re doing. It was very, very useful in the work that we were doing for Big Tech companies in classic AI. It has less utility in large language models, but continues to have lots of utility in our business. Dana Buska: Okay. Wonderful. With the kind of fast-moving marketplace in fine-tuning and reinforcement learning, do you have any estimates about how large that market is right now? Jack Abuhoff: I think there are a lot of different estimates. The one that we’ve shared in the past, and I don’t have the data in front of me, was a Bloomberg estimate looking at AI and large language model-related services and showing that there would be a significant expansion in that market. I’d probably point you to that, and I’d be happy to send you a reference for it after the call. Dana Buska: Okay. Okay. Great. That’s excellent. And in the last couple of conference calls, you talked about your white label agreement, and I was just wondering, how is that going? Are you seeing any inroads with that? Jack Abuhoff: Yeah. We’re seeing inroads. We still think it’s early days. Again, it’s early days for enterprise applications as a whole. We had a very good quarter with that customer in Q4. I think we’re going to see pickup from the white label partnership beginning in Q1 and probably through the year.
But again, I view that very much as a seed that we’ve planted for the enterprise side of the business. Right now, the growth that you’re seeing is primarily from the data engineering work that we’re doing for the internal builds that the hyperscalers and large tech companies are working on. Dana Buska: Okay. And what strategies are you employing to differentiate yourselves from your competitors? Jack Abuhoff: So, I think it depends on the line of business. If you think about the services side of the business, which is the bulk of the business, it’s 80% of the business, what we need to do is no different from what any other services company would need to do. We have to do a very good job at what we’re hired to do. Just like the question Tim asked, he said, well, is the data quality really important? And I think the answer to that is, as I said, it clearly is critical. It’s what we’re being hired to do. Beyond that, you care about the level of service that you’re obtaining. You care about the qualities that the vendor is bringing to the relationship. You care about how tightly aligned they are with your engineering team, and whether, when they zig, you can zag, and whether you can follow their lead and be responsive to their changing requirements. We’re bringing that to the table. Dana Buska: Okay. Excellent. And do you have any new products or services that you’re excited to be introducing this year? Jack Abuhoff: Yeah. So, I think there’s a lot that’s going on. When you look at the field as a whole, what you see, and what we’re starting to see, is the spread of activities around languages, around domains, around what we call text to X, the different modalities that large language models are going to be required to support. And again, I focus on that because it’s within the growth area of our services that is most important. So we’re doing a lot of work in those areas.
We’re also doing a lot of work in terms of trust and safety and aligning our capabilities to our customers’ emerging requirements in terms of helping ensure that the models perform as expected. That’s going to be an important area. In other areas of the business, we’re releasing new product capabilities. We’ve got some things coming out in medical data extraction that we’re excited about. We’ve got an AI roadmap that is very compelling and is now being well received in beta by customers in the Agility segment. So, we’re excited about that as well. Dana Buska: Do you have any plans to do images with Agility? Jack Abuhoff: I’m sorry, doing images? Dana Buska: Images. Yeah. Jack Abuhoff: So I think that the primary use case of Agility -- it’s a media intelligence platform and an end-to-end workflow for PR professionals that require the ability to craft messages, to find out who best to target with those messages, and then to monitor and analyze news and social media globally. So there’s not really a huge requirement for images within that product other than what we’ve already integrated. So, for example, we’ve already integrated AI that can be used to monitor news and imagery within the news. So if your logo, for example, is contained in a piece of news, we can inform our customers that that has been observed. Dana Buska: Okay. Great. That does it for me. Thanks for answering my questions. Jack Abuhoff: Thank you. Operator: [Operator Instructions] Up next is Bill Thompson with Kerro Capital [ph]. Please proceed. Unidentified Analyst: Hey. Good afternoon. Jack Abuhoff: Hi, Bill. Good afternoon. Unidentified Analyst: Congrats on the quarter. I was pleasantly surprised to see that the company made a profit based on the recent performance. That’s definitely a nice change. I had a question about the Agility business. So you stated multiple times that the Agility business is actually profitable as it stands now.
Is that on a GAAP basis or is that by adjusted EBITDA? Jack Abuhoff: It is both GAAP and adjusted EBITDA, but we do use adjusted EBITDA as a core metric, because we think that it’s useful. When we’re looking at adjusted EBITDA, as you may be aware, we’re carving out D&A, stock option expense, obviously income tax, and then one-time severance costs that are not recurring. But it was also profitable on a GAAP basis. Unidentified Analyst: Okay. And you’re sure about that? Jack Abuhoff: Yes. Unidentified Analyst: I’m looking through the announcement and it’s unclear. It’s not usually broken out. I have another question. Jack Abuhoff: We’d be happy to separately take you through that and answer any detailed questions you have. Unidentified Analyst: Okay. That’d be excellent. I have another question. So you had a very experienced CFO two years ago and the person resigned, I believe it was two days before the report was signed and submitted to the SEC. So it was pretty abrupt. And then the company put in place an Interim CFO and it’s been two years. The company claimed that they were -- well, you at the time, you claimed that you were in the process of looking for a full-time CFO. However, it’s been two years and there’s still an Interim CFO. Can you give us an update on that process of looking for a full-time CFO? Jack Abuhoff: Yeah. So in, I think it was, March of 2021, we hired an SVP of Finance and Corporate Development. And his function and his mandate was to put in place a stronger strategic finance function than we had at the time. We saw that that was an important need that we had. And what that function does is, it looks at how we’re managing cash. It looks at the return that we’re getting on investments that we’re making. It looks at and takes ownership of our budgeting and all of those functions.
So it’s a strategic, forward-looking function, providing leadership around how we’re managing the business and the investments that we’re making. We already had very strong talent in terms of the controllership function. What we found with hiring this person and the talent that we have in place is that we’ve got strong talent kind of end-to-end right now in the finance function. I think arguably the piece that we may be lacking and the piece that we need to think through more carefully as it becomes more important is the Investor Relations component, the public company component. Are we spending enough time doing outreach with investors? Unidentified Analyst: I hate to interrupt, but I know you like to editorialize a lot, but are you saying that you currently don’t need a full-time CFO and that the interim is going to continue? Jack Abuhoff: What I’m saying is that as we think about the need for a CFO, we’re doing a lot of thinking about the Investor Relations function and the role of someone who would be working with our analysts who may be thinking about covering our company and things like that. From the perspective of capabilities for what we need today, I think we’re very, very well covered and we’ve got very strong talent in place. Unidentified Analyst: Okay. And then one last thing. I’m looking at the numbers from the press release and it looks like Agility had a $1.3 million GAAP loss. Can you verify that, either the CFO or yourself, Jack? Jack Abuhoff: So we -- I don’t have the numbers in front of me right now, but we had a GAAP profit, and again, I’d be very happy offline to put you in touch with... Unidentified Analyst: I mean, it’s kind of a big deal. You guys just finished the quarter. You should know the GAAP profitability of your business segments. Do you have a straight answer for that? Jack Abuhoff: So, well, I think that -- I’m not sure exactly what you’re trying to get me to say. I told you that...
Unidentified Analyst: I just want to know how -- I’m investing in the company. I would like to know how much money the company’s making. It’s pretty straightforward. Jack Abuhoff: So, we had $440,000 of GAAP profit in Agility in the quarter. Unidentified Analyst: Because I’m seeing a net loss of $1.35 million, again, for the year. Jack Abuhoff: I’d be very happy to have a call with you to drill down into that and look at what you’re looking at and how that differs from what we’re reporting. I don’t know how I can help you beyond that. Unidentified Analyst: All right. I appreciate it. Operator: We have reached the end of the question-and-answer session. I will now turn the call over to Jack for closing remarks. Jack Abuhoff: Thank you. In 2023, the world witnessed a seismic shift with the arrival of OpenAI’s ChatGPT. It stole the spotlight. It wasn’t just another software release. It was a phenomenon. It captivated the world with its ability to perform what seemed like superhuman feats. And this sparked a wave of development, with companies vying to push the boundaries of language generation and its applications. We saw tech giants locked in a heated race to dominate the realm of generative AI models, and this arms race resulted in billions of dollars of ongoing investment being made by these companies, with ripple effects potentially reshaping every industry we know. It’s essential to underscore, and I think a couple of these questions were useful in that regard, that in the realm of training large language models, the age-old adage of garbage in, garbage out holds particularly true. This is where our distinct advantage comes into play, as we’ve been consistently delivering high-quality data at scale for 30 years. One of our competitive advantages lies in providing unparalleled data quality, which serves as the foundation for successful AI implementations.
Moreover, our success is bolstered by the entrepreneurial and collaborative culture that we’ve cultivated over the decades, engaging with large corporations across diverse industries. This empowering culture has enabled us to compete with other businesses at a remarkably high success rate, driving our continued growth and our achievements. We saw the business pick up momentum through the year as we began to seize the generative AI opportunity, and we met or exceeded expectations on all fronts: revenue growth, adjusted EBITDA growth and key customer acquisition. In Q4, same thing. We beat both top and bottomline guidance and we entered a three-year, $23 million per year deal with a key Big Tech customer for the program we kicked off in the middle of last year, clearly a testament to how highly they value our collaboration. We’re off to an exciting start to 2024. As you know, we’re now engaged with five of the seven for generative AI development and we’re seeing the benefits of this engagement in our results. In 2024, we will be working to drive expansion in all these accounts and to land others. We’re guiding to 20% growth in 2024, but our ambition is to exceed that. My team and I are energized by what we’ve accomplished in 2023 and we’re excited about what we will accomplish in 2024. So thank you all for joining the call today. We look forward to our next call. Operator: This concludes today’s conference and you may disconnect your lines at this time. Thank you for your participation.