Hasty, a provider of annotation tools for vision AI practitioners, raised $3.7 million in seed funding.
The seed funding round was led by Shasta Ventures with participation from coparion and iRobot Ventures. The company plans to use the latest funds to accelerate product development and expand its customer base across Europe and North America.
Computer vision (CV), or vision AI, enables computers to gain a high-level understanding from digital images or videos and has the potential to change how we perform daily tasks – from diagnosing diseases to tracking packages through the supply chain to analyzing advanced materials worldwide.
Current approaches to data labeling are slow. Because they don’t train and update the model during labeling, roughly 80% of a data scientist’s time is spent finding, cleaning, and organizing the “ground truth” data used to train their neural networks.
It therefore comes as no surprise that more than half of vision-based AI projects never make it into production. Because of the delay between labeling and model training, machine learning engineers often have to wait three to six months for the first results that show whether their annotation strategy and approach are working.
Hasty’s next-generation annotation tool labels data in one-tenth of the time through self-learning AI assistants and agile machine learning that provide rapid feedback, so engineers can validate and adapt models as they work.
“There are over 750,000 machine learning practitioners today who are working on vision AI topics and spending the majority of their time managing data rather than building and tuning neural networks,” said Tristan Rouillard, co-founder and CEO of Hasty.
“This represents a $30 billion waste that we aim to tackle head-on. That is why we have created a next-generation annotation tool and community that reduces data prep and management time by 70%. We put our customers ahead of the game and significantly increase their ability to bring more successful vision AI projects to market, faster.”