Clika is building a platform to make AI models run faster

Ben Asaf spent several years establishing the development infrastructure at Mobileye, the autonomous driving startup acquired by Intel in 2017. At the same time, he worked on methods to speed up the training of AI models at Hebrew University.

Asaf is an expert in MLOps (machine learning operations), the discipline of building tools that streamline getting AI models into production and managing them once they're there. That background inspired him to create a company that could remove significant obstacles for software engineers and organizations seeking to deploy AI models in real-world settings.

“When the idea of starting a company first crossed my mind, very few companies and individuals had practical experience in implementing MLOps into their AI development pipelines,” Asaf explained in an email interview with TechCrunch. “I believed we could make AI more compact, making it lighter, faster, and more cost-effective for commercialization.”

In 2021, Asaf joined forces with Nayul Kim, his wife, who had been working as a digital transformation consultant for enterprises. Together, they co-founded Clika, one of the startups competing in the Startup Battlefield 200 competition at TechCrunch Disrupt. Clika offers a toolkit that enables companies to automatically reduce the size of their internally developed AI models, thereby reducing the computational resources they consume and accelerating their inference speed.

“With Clika, you can effortlessly connect your pre-trained AI models and receive an automatically compressed model that is fully compatible with your target device—whether it’s a server, the cloud, the edge, or an embedded device,” Asaf elaborated.

To achieve this, Clika employs techniques like quantization, which reduces the number of bits used to represent information in a model. Quantization sacrifices some numerical precision, but it shrinks the model without compromising its ability to perform specific tasks, such as identifying different dog breeds. Additionally, Clika generates a report outlining potential improvements or adjustments to enhance model performance.
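To give a rough sense of what quantization does, here is a minimal sketch of per-tensor, post-training int8 quantization in NumPy. It is a generic illustration, not Clika's engine, and the helper names (quantize_int8, dequantize) are made up for this example: float32 weights are mapped to 8-bit integers plus a single scale factor, cutting storage roughly fourfold at the cost of a small reconstruction error.

```python
# Illustrative only: per-tensor symmetric int8 quantization, not Clika's
# proprietary compression engine.
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights to int8 values plus one per-tensor scale factor."""
    scale = np.abs(weights).max() / 127.0  # largest magnitude maps to 127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 representation."""
    return q.astype(np.float32) * scale

# A toy layer's weights: float32 uses 4 bytes per value, int8 uses 1.
w = np.random.randn(256, 256).astype(np.float32)
q, scale = quantize_int8(w)

print(f"float32 size: {w.nbytes // 1024} KiB")   # 256 KiB
print(f"int8 size:    {q.nbytes // 1024} KiB")   # 64 KiB, about 4x smaller
print(f"max abs error: {np.abs(w - dequantize(q, scale)).max():.4f}")
```

Production toolkits layer more sophisticated choices on top of this basic idea, such as per-channel scales, calibration data, or quantization-aware fine-tuning, to recover the lost precision.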

As the AI industry grapples with shortages of the hardware needed to run these models, interest in making models more efficient is on the rise. Microsoft recently cautioned in an earnings report about potential Azure service disruptions due to AI hardware shortages, and Nvidia's best-performing AI chips, the H100-series GPUs, are reportedly sold out until 2024.

Clika is not the only startup pursuing AI model compression solutions. Competitors include Deci, backed by Intel; OctoML, which, like Clika, automatically optimizes and packages models for various hardware platforms; and CoCoPie, a startup focused on optimizing AI models specifically for edge devices.

However, Asaf contends that Clika possesses a technological edge. “While other solutions rely on rule-based compression techniques, Clika’s compression engine takes an AI-driven approach by understanding the unique structures of different AI models and applying the most suitable compression method for each,” he noted. “We have the world’s leading compression toolkit for vision AI, surpassing the performance of existing solutions developed by Meta and Nvidia.”

While claiming to be the “world’s best” is a bold assertion, Clika has successfully attracted investors, raising $1.1 million in a pre-seed round last year with participation from Kimsiga Lab, Dodam Ventures, D-Camp, and angel investor Lee Sanghee.

Asaf refrained from disclosing customer progress at this stage, as Clika is presently pre-revenue and conducting a closed beta for a select group of businesses. Nonetheless, he mentioned that Clika intends to seek seed funding in the near future.
