Helping nonexperts build advanced generative AI models

The impact of artificial intelligence will never be equitable if there’s only one company that builds and controls the models (not to mention the data that go into them). Unfortunately, today’s AI models are made up of billions of parameters that must be trained and tuned to maximize performance for each use case, putting the most powerful AI models out of reach for most people and companies.

MosaicML started with a mission to make those models more accessible. The company, which counts Jonathan Frankle PhD ’23 and MIT Associate Professor Michael Carbin as co-founders, developed a platform that let users train, improve, and monitor open-source models using their own data. The company also built its own open-source models using graphics processing units (GPUs) from Nvidia.

The approach made deep learning, a nascent field when MosaicML began, accessible to far more organizations as excitement around generative AI and large language models (LLMs) exploded following the release of ChatGPT. It also made MosaicML a natural complement for data management companies committed to helping organizations make use of their data without handing it over to AI companies.

Last year, that reasoning led to the acquisition of MosaicML by Databricks, a global data storage, analytics, and AI company that works with some of the largest organizations in the world. Since the acquisition, the combined companies have released one of the highest-performing open-source, general-purpose LLMs yet built. Known as DBRX, this model has set new benchmarks in tasks like reading comprehension, general knowledge questions, and logic puzzles.

Since then, DBRX has gained a reputation for being one of the fastest open-source LLMs available and has proven especially useful at large enterprises.

More than the model, though, Frankle says DBRX is significant because it was built using Databricks tools, meaning any of the company’s customers can achieve similar performance with their own models, which will accelerate the impact of generative AI.

“Honestly, it’s just exciting to see the community doing cool things with it,” Frankle says. “For me as a scientist, that’s the best part. It’s not the model, it’s all the amazing stuff the community is doing on top of it. That’s where the magic happens.”

Making algorithms efficient

Frankle earned bachelor’s and master’s degrees in computer science at Princeton University before coming to MIT to pursue his PhD in 2016. Early on at MIT, he wasn’t sure what area of computing he wanted to study. His eventual choice would change the course of his life.

Frankle ultimately decided to focus on a form of artificial intelligence known as deep learning. At the time, deep learning and artificial intelligence did not inspire the same broad excitement as they do today. Deep learning was a decades-old area of study that had yet to bear much fruit.

“I don’t think anyone at the time anticipated deep learning was going to blow up in the way that it did,” Frankle says. “People in the know thought it was a really neat area and there were a lot of unsolved problems, but phrases like large language model (LLM) and generative AI weren’t really used at that time. It was early days.”

Things began to get interesting with the 2017 release of a now-famous paper by Google researchers, “Attention Is All You Need,” in which they showed a new deep-learning architecture known as the transformer was surprisingly effective at language translation and held promise across a number of other applications, including content generation.

In 2020, eventual Mosaic co-founder and tech executive Naveen Rao emailed Frankle and Carbin out of the blue. Rao had read a paper the two had co-authored, in which the researchers showed a way to shrink deep-learning models without sacrificing performance. Rao pitched the pair on starting a company. They were joined by Hanlin Tang, who had worked with Rao on a previous AI startup that had been acquired by Intel.
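That paper dealt with making trained networks smaller. One common way to shrink a model in that spirit is magnitude pruning, which zeroes out the weights that contribute least. The PyTorch sketch below is purely illustrative, not the procedure from Frankle and Carbin’s paper:

    import torch
    import torch.nn as nn
    import torch.nn.utils.prune as prune

    # A small stand-in network (illustrative only).
    model = nn.Sequential(
        nn.Linear(784, 256),
        nn.ReLU(),
        nn.Linear(256, 10),
    )

    # Zero out the 80 percent of weights with the smallest magnitude in each
    # linear layer. Whether accuracy survives depends on retraining afterward.
    for module in model.modules():
        if isinstance(module, nn.Linear):
            prune.l1_unstructured(module, name="weight", amount=0.8)

    # Measure how sparse the network has become.
    linears = [m for m in model.modules() if isinstance(m, nn.Linear)]
    zeros = sum(int((m.weight == 0).sum()) for m in linears)
    total = sum(m.weight.numel() for m in linears)
    print(f"{zeros / total:.0%} of weights pruned")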

The founders started by reading up on different techniques used to speed up the training of AI models, eventually combining several of them to show they could train a model to perform image classification four times faster than what had been achieved before.

“The trick was that there was no trick,” Frankle says. “I think we had to make 17 different changes to how we trained the model in order to figure that out. It was just a little bit here and a little bit there, but it turns out that was enough to get incredible speed-ups. That’s really been the story of Mosaic.”
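For a flavor of what “a little bit here and a little bit there” can look like, the sketch below stacks two standard, independent PyTorch training speedups: channels-last memory format and automatic mixed precision. It is a generic illustration of composing small changes, not MosaicML’s actual recipe:

    import torch
    import torch.nn as nn
    from torchvision.models import resnet50

    device = "cuda"
    # Channels-last memory layout speeds up convolutions on recent GPUs.
    model = resnet50().to(device, memory_format=torch.channels_last)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
    # GradScaler keeps half-precision gradients from underflowing.
    scaler = torch.cuda.amp.GradScaler()
    loss_fn = nn.CrossEntropyLoss()

    def train_step(images, labels):
        images = images.to(device, memory_format=torch.channels_last)
        labels = labels.to(device)
        optimizer.zero_grad(set_to_none=True)
        # Autocast runs most ops in half precision, a second independent speedup.
        with torch.cuda.amp.autocast():
            loss = loss_fn(model(images), labels)
        scaler.scale(loss).backward()
        scaler.step(optimizer)
        scaler.update()
        return loss.item()

Each change on its own buys only a modest speedup; the gains come from stacking many of them.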

The team showed their techniques could make models more efficient, and in 2023 they released an open-source large language model, MPT, along with Composer, an open-source library of their methods. They also developed visualization tools to let developers map out different experimental options for training and running models.
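Composer’s core pattern is to express each training tweak as an “algorithm” object and hand a list of them to a trainer, so methods can be stacked with one line each. The sketch below follows the shape of the library’s public examples; the exact class names and keyword arguments (ComposerClassifier, max_duration, and so on) are recalled from its documentation and may vary across versions:

    import torch
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms

    # Composer imports; names follow the library's public examples.
    from composer import Trainer
    from composer.algorithms import ChannelsLast, LabelSmoothing
    from composer.models import ComposerClassifier

    # A plain PyTorch convnet and dataloader; Composer wraps rather than replaces them.
    net = torch.nn.Sequential(
        torch.nn.Conv2d(1, 16, 3, padding=1),
        torch.nn.ReLU(),
        torch.nn.AdaptiveAvgPool2d(1),
        torch.nn.Flatten(),
        torch.nn.Linear(16, 10),
    )
    train_dl = DataLoader(
        datasets.MNIST("data", train=True, download=True, transform=transforms.ToTensor()),
        batch_size=128,
    )

    # Each entry in `algorithms` is one small training change, composed automatically.
    trainer = Trainer(
        model=ComposerClassifier(net, num_classes=10),
        train_dataloader=train_dl,
        optimizers=torch.optim.SGD(net.parameters(), lr=0.1),
        max_duration="1ep",  # one epoch
        algorithms=[ChannelsLast(), LabelSmoothing(smoothing=0.1)],
    )
    trainer.fit()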

MIT’s E14 Fund invested in Mosaic’s Series A funding round, and Frankle says E14’s team offered helpful guidance early on. Mosaic’s progress enabled a new class of companies to train their own generative AI models.

“There was a democratization and an open-source angle to Mosaic’s mission,” Frankle says. “That’s something that has always been very close to my heart. Ever since I was a PhD student and had no GPUs because I wasn’t in a machine learning lab and all my friends had GPUs. I still feel that way. Why can’t we all participate? Why can’t we all get to do this stuff and get to do science?”

Open sourcing innovation

Databricks had also been working to give its customers access to AI models. The company finalized its acquisition of MosaicML in 2023 for a reported $1.3 billion.

“At Databricks, we saw a founding team of academics just like us,” Frankle says. “We also saw a team of scientists who understand technology. Databricks has the data, we have the machine learning. You can’t do one without the other, and vice versa. It just ended up being a really good match.”

In March, Databricks released DBRX, which gave the open-source community and enterprises building their own LLMs capabilities that were previously limited to closed models.

“The thing that DBRX showed is you can build the best open-source LLM in the world with Databricks,” Frankle says. “If you’re an enterprise, the sky’s the limit today.”

Frankle says Databricks’ team has been encouraged by the company’s internal use of DBRX across a wide variety of tasks.

“It’s already great, and with a little fine-tuning it’s better than the closed models,” he says. “You’re not going to be better than GPT for everything. That’s not how this works. But nobody wants to solve every problem. Everybody wants to solve one problem. And we can customize this model to make it really great for specific scenarios.”

As Databricks continues pushing the frontiers of AI, and as competitors continue to invest huge sums into AI more broadly, Frankle hopes the industry comes to see open source as the best path forward.

“I’m a believer in science and I’m a believer in progress and I’m excited that we’re doing such exciting science as a field right now,” Frankle says. “I’m also a believer in openness, and I hope that everybody else embraces openness the way we have. That’s how we got here, through good science and good sharing.”
