Deploying Deep Learning models to production is a costly affair. Compressing models can save millions of dollars in GPU compute! At ElementalsAI we are focused on developing ever better techniques to compress Deep Learning models.
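One common compression technique is post-training quantization. The sketch below, a symmetric 8-bit scheme written with NumPy for illustration, is an assumption about how such a method might look, not a description of any specific ElementalsAI implementation.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights to int8 using a single symmetric scale factor."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from the int8 representation."""
    return q.astype(np.float32) * scale

# Toy weight matrix standing in for a trained layer.
w = np.random.default_rng(0).normal(size=(256, 256)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# int8 storage is 4x smaller than float32, at a small reconstruction cost:
# the round-off error per weight is at most half of one quantization step.
max_err = np.abs(w - w_hat).max()
```

Storing weights in int8 cuts memory and bandwidth by 4x versus float32, which is where much of the serving-cost saving comes from.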
Self-Supervised, Semi-supervised & Weakly Supervised Learning
Large, annotated datasets for a specialized task are rarely available in industry; companies more often have large amounts of raw data. Semi-supervised learning, which requires only a small labeled set; weakly supervised learning, which can train models with noisy labels; and self-supervised techniques, which are great at learning representations, can all be game changers in creating accurate Deep Learning models. At ElementalsAI we constantly leverage these techniques to solve data-related problems for our customers.
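One simple semi-supervised recipe is pseudo-labeling: train on the small labeled set, label the raw data with the model's own predictions, and retrain on everything. A minimal sketch, using a nearest-centroid classifier on synthetic data purely for illustration (real systems would use a neural network and confidence thresholds):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two Gaussian blobs: a handful of labeled points, many unlabeled ones.
X_labeled = np.vstack([rng.normal(-2, 0.5, (5, 2)), rng.normal(2, 0.5, (5, 2))])
y_labeled = np.array([0] * 5 + [1] * 5)
X_unlabeled = np.vstack([rng.normal(-2, 0.5, (100, 2)),
                         rng.normal(2, 0.5, (100, 2))])

def fit_centroids(X, y):
    """One centroid per class: the simplest possible classifier."""
    return np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(X, centroids):
    """Assign each point to its nearest class centroid."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

# Step 1: fit on the small labeled set, then pseudo-label the raw data.
centroids = fit_centroids(X_labeled, y_labeled)
pseudo = predict(X_unlabeled, centroids)

# Step 2: refit on labeled + pseudo-labeled data combined.
X_all = np.vstack([X_labeled, X_unlabeled])
y_all = np.concatenate([y_labeled, pseudo])
centroids = fit_centroids(X_all, y_all)
```

The retrained centroids are estimated from 210 points instead of 10, which is the leverage pseudo-labeling aims for.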
Graph Neural Networks
Graphs are another segment of unstructured data. What makes building machine learning models on graph data challenging is that, unlike images or sequences, graphs vary in structure: the nodes and edges differ from sample to sample. Representing graphs in a way that lets us build convolutional models is an exciting direction, and at ElementalsAI we are exploring how we can best exploit this method to solve several problems.
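The core idea behind graph convolutions can be sketched in a few lines: each node updates its features by aggregating from its neighbors through a normalized adjacency matrix. The toy graph, feature sizes, and random weights below are illustrative assumptions, following the widely used H' = ReLU(Â H W) formulation.

```python
import numpy as np

# Adjacency matrix of a 4-node path graph: 0-1-2-3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=np.float32)

A_hat = A + np.eye(4, dtype=np.float32)      # add self-loops
deg = A_hat.sum(axis=1)
D_inv_sqrt = np.diag(deg ** -0.5)
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt     # symmetric normalization

# Node features and layer weights (random, for illustration only).
H = np.random.default_rng(1).normal(size=(4, 8)).astype(np.float32)
W = np.random.default_rng(2).normal(size=(8, 16)).astype(np.float32)

# One message-passing step: every node mixes in its neighbors' features.
H_next = np.maximum(A_norm @ H @ W, 0.0)     # ReLU(A_norm @ H @ W)
```

Because the adjacency matrix is an input rather than a fixed shape, the same weights W apply to graphs of any size, which is how these models cope with varying structure.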
Probabilistic Graphical Models
Probabilistic models like Markov Models & Bayesian Networks are an underrated family of machine learning algorithms. We've utilized these algorithms in combination with Deep Learning to derive better results in Deep Learning-based products.
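As a flavor of what these models look like, here is a minimal two-state Markov chain; the weather states and transition probabilities are made up for illustration. Repeatedly applying the transition matrix to a belief vector converges to the chain's stationary distribution.

```python
import numpy as np

# P[i, j] = probability of moving from state i to state j.
states = ["sunny", "rainy"]
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Start fully confident it is sunny, then evolve the belief: pi <- pi @ P.
pi = np.array([1.0, 0.0])
for _ in range(100):
    pi = pi @ P

# The balance equations give the stationary distribution analytically:
# 0.1 * pi_sunny = 0.5 * pi_rainy, so pi = (5/6, 1/6).
```

The same machinery (states, transition probabilities, belief propagation) underlies the richer Markov Models and Bayesian Networks mentioned above.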
Federated Learning
Federated Learning enables decentralized training of Deep Learning models across multiple remote nodes, essentially enabling training on edge devices. This has several benefits, including preserving the privacy of user data, providing offline predictive services, and allowing companies to scale their machine learning solutions economically.
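The privacy benefit comes from the fact that only model weights, never raw data, leave each device. A minimal sketch of the server-side step of Federated Averaging (FedAvg), with plain weight vectors standing in for models and made-up client sizes:

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Combine locally trained weights in proportion to each client's
    dataset size; the clients' raw training data is never transmitted."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Three simulated clients with different amounts of local data.
client_weights = [np.array([1.0, 2.0]),
                  np.array([3.0, 0.0]),
                  np.array([0.0, 4.0])]
client_sizes = [100, 50, 50]

# Server step: weighted average = 0.5*[1,2] + 0.25*[3,0] + 0.25*[0,4].
global_weights = federated_average(client_weights, client_sizes)
```

In a full system this averaging round alternates with local training on each device, but the aggregation step above is the heart of the protocol.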
Let us Leapfrog together with AI
We would love to connect with you to better understand your business goals and develop a state-of-the-art solution with our research-oriented, forward-looking engineering team.