Indian AI startup TWO is quietly making its mark on the emerging artificial intelligence market. In an interview with AIM, founder Pranav Mistry revealed that the company had generated $4 million in revenue this quarter and expects $20 million for next year.
This development comes as TWO AI’s SUTRA, a series of multilingual online GenAI models, has added another feather to its cap. The company claims it outperforms GPT-4o, Llama 3.1 and Indic LLMs, including Nanda from G42, OpenHathi from Sarvam and Airavata from AI4Bharat, leading in over 14 Indian languages.
Earlier this year, the company launched ChatSUTRA, a ChatGPT-like chatbot. Mistry shared that the platform currently has over 600,000 unique users.
Unlike other startups, TWO AI targets only large enterprise customers instead of catering to the mainstream market. “Jio is one of our largest enterprise customers, and we also work with customers like Shinhan Bank and Samsung SDS in Korea,” Mistry said. He further revealed that the company has started partnering with companies like NVIDIA and Microsoft on the technology side.
“We are targeting India, Korea, Japan and parts of Southeast Asia, such as Vietnam, especially the central region. APAC (Asia Pacific) is one of the key markets that we will always focus on,” added Mistry.
Recently, TWO welcomed Mukesh Ambani, Chairman of Reliance Industries, and Akash Ambani, Chairman of Reliance Jio Infocomm, to their US office. Over a cup of tea, they discussed the evolving role of AI in India and beyond.
Without naming names, Mistry revealed that one of India’s largest banks and financial services firms will soon be among the company’s clients. “Our solutions are in high demand, especially in industries like finance, services and retail,” he said.
SUTRA’s business model focuses on providing customized and highly personalized solutions for a select group of large enterprises. “We don’t need 100 customers,” Mistry said. “We need 10 good customers.” He explained that this approach still reaches end users at massive scale, since each of these enterprise customers already serves millions of its own customers.
“The goal is not to become the OpenAI of the world, nor just an application-layer company, but an AI solutions company, taking on large enterprises and helping them solve AI problems,” he added. He shared his goal of following a path similar to Palantir’s.
What’s next?
Mistry revealed that the company’s next project is predictive AI. “Predictive AI is a game changer for these data-dependent industries. From manufacturing to finance, governance and energy, everyone can truly benefit from the decision-making power of forecasting,” he explained.
The model is called Sutra Predict. Mistry pointed out that it is a small model trained on billions of time series data points. “The model is small because its architecture is much simpler than text-based models, and it is already showing great results in some particular areas that our customers are already trying.”
He explained that time series predictive models are a specific type of statistical model used to analyze and forecast data points collected over time. They are designed to identify patterns, trends and seasonal variations within data in order to make predictions about future values.
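As a rough, generic illustration of this idea (not Sutra Predict’s actual architecture), the sketch below fits a classical Holt-Winters model that captures trend and seasonality in a monthly series and projects future values; the data and parameters are invented for the example.

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic monthly series with an upward trend and yearly seasonality
# (purely illustrative data, not from any real customer).
months = np.arange(48)
series = 100 + 0.8 * months + 10 * np.sin(2 * np.pi * months / 12) \
         + np.random.normal(0, 2, 48)

# Holt-Winters models level, trend and seasonal variation -- the same
# ingredients the article describes for time series predictive models.
model = ExponentialSmoothing(series, trend="add", seasonal="add",
                             seasonal_periods=12)
fitted = model.fit()

# Forecast the next 12 months from the learned patterns.
print(fitted.forecast(12))
```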
Mistry explained that with the advent of transformers, models can now process and integrate any type of data, as seen in predictive models like Google TimesFM and Amazon Chronos. Highlighting a real-world application, he explained that an Indian electric vehicle battery diagnostics company uses Sutra Predict to identify fire risks by monitoring temperature and voltage fluctuations.
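To make the battery example concrete, here is a hypothetical monitoring sketch, not TWO AI’s or the diagnostics company’s actual pipeline: it predicts the expected cell temperature from a rolling window and flags readings that fall far outside the expected band; the same pattern could be applied to voltage.

```python
import numpy as np

def flag_anomalies(readings, window=24, k=4.0):
    """Flag readings that deviate sharply from a rolling-window forecast.

    A naive stand-in for a learned predictive model: the 'forecast' for each
    step is the mean of the previous `window` readings, and anything more
    than `k` rolling standard deviations away is treated as a risk signal.
    """
    flags = []
    for t in range(window, len(readings)):
        history = readings[t - window:t]
        expected = history.mean()
        tolerance = k * history.std() + 1e-6
        if abs(readings[t] - expected) > tolerance:
            flags.append(t)  # index of the suspicious reading
    return flags

# Hypothetical pack-temperature trace (deg C) with an injected spike.
temps = np.full(200, 35.0) + np.random.normal(0, 0.5, 200)
temps[150:155] += 20.0  # sudden heating event
print(flag_anomalies(temps))
```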
Meeting the GPU Challenge
“In India, no one had access to the level of GPU clusters that we needed,” Mistry said. The SUTRA team overcame this limitation by porting their models to run on CPU clusters. Despite the challenges, he said the team was able to scale and serve up to 100 billion customers.
Moreover, Mistry said that they were the first to recognize the trend of 1-bit LLMs.
Notably, Microsoft recently introduced BitNet.cpp, an inference framework for 1-bit LLMs, enabling fast and efficient inference for models like BitNet b1.58.
Mistry said he successfully adapted the SUTRA model to work with 1-bit weights, allowing it to run as a lightweight model on CPUs.
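For context on what 1-bit (more precisely, 1.58-bit ternary) weights look like, the sketch below applies the absmean quantization described for BitNet b1.58-style models to a weight matrix; it illustrates the general technique, not how SUTRA’s adaptation is actually implemented.

```python
import numpy as np

def absmean_ternary_quantize(weights, eps=1e-8):
    """Quantize a weight matrix to ternary values {-1, 0, +1}.

    Follows the absmean scheme popularized by BitNet b1.58: scale by the
    mean absolute weight, round, and clip. The scale is kept so outputs
    can be rescaled at inference time.
    """
    scale = np.abs(weights).mean() + eps
    quantized = np.clip(np.round(weights / scale), -1, 1)
    return quantized.astype(np.int8), scale

# Example: quantize a random full-precision layer.
w = np.random.normal(0, 0.02, size=(4, 8))
w_q, s = absmean_ternary_quantize(w)
print(w_q)                        # ternary weights, ~1.58 bits of information each
print(np.abs(w - w_q * s).max())  # worst-case quantization error after rescaling
```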
Additionally, in partnership with NVIDIA, the company launched SUTRA-OP, offering systems like the NVIDIA DGX (Deep GPU Xceleration), equipped with powerful GPUs for demanding AI tasks.
For customers requiring lighter, more cost-effective solutions, SUTRA also offers its own hardware options, including SUTRA OP2, OP4 and OP8, available for rental. “Customers don’t buy this, they rent it from us. This is a monthly lease for OP and SUTRA solutions,” Mistry said.
The company recently launched a voice-to-voice AI model called Sutra HiFi. Using a dual-cast transformer architecture, the model effectively separates distinct vocal tones from language-specific accents, promising better voice interaction quality.
“Sutra HiFi provides the ability to seamlessly interpret conversations in the languages we care about. Currently, it supports 12 languages which we have tested well,” Mistry said. He suggested that Sutra HiFi can easily boost applications in India or any other multilingual market while maintaining low cost and high accuracy.
Discussing Infosys co-founder Nandan Nilekani’s view that India should be the capital of AI use cases, Mistry said he had a slightly different perspective. “India needs to focus on developing core AI capabilities as we do not want to become dependent on anyone else in future as data is one of the gold mines of AI,” he concluded.