On sci-fi TV shows like Star Trek, when a doctor needs to cure a disease, they often ask the computer to run complex simulations or comb through databases for potential cures. In real life, setting up complex simulations, especially in drug discovery, isn’t nearly as easy: it can take hours’ or even days’ worth of coding and effort to get AI models working on a simulation problem or to identify a potential new therapy for experimentation.
Utah-based Recursion, which has built multiple AI and machine learning models to discover drugs currently in clinical trials, aims to change that. At the J.P. Morgan Healthcare Conference Monday morning, the company announced its software platform Lowe, which uses a large language model as a natural language interface, enabling scientists to query all the company’s models in tandem for complex drug discovery tasks without having to code the simulations themselves.
“We’ve got more than 20 different tools we’ve built at Recursion,” Recursion cofounder and CEO Chris Gibson told Forbes. “And it’s a little bit challenging to become an expert in how to use all of them. The LLM is a vehicle by which we can give our scientists access to them.”
A demo of the technology started with a simple query similar to what a person might enter into ChatGPT, in this case a request for a list of potential genetic targets in lung cancer. Before returning results, the software first showed its interpretation of the question, giving the user a chance to confirm it was doing what it was asked, then queried a specific Recursion database and provided those targets.
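That confirm-before-execute step resembles a common pattern in LLM tool use: have the model restate the request as a structured plan, show the plan to the user, and only run the underlying query once it is approved. Here is a minimal sketch of that idea in Python; the function names, database name and hard-coded values are hypothetical placeholders, not Recursion’s actual API.

```python
# Hypothetical confirm-before-execute flow; names and values are illustrative only.
from dataclasses import dataclass

@dataclass
class QueryPlan:
    interpretation: str   # the model's restatement of the user's question
    database: str         # which internal dataset it intends to query
    filters: dict         # structured parameters derived from the question

def plan_query(user_question: str) -> QueryPlan:
    # In practice an LLM would produce this plan; hard-coded here for clarity.
    return QueryPlan(
        interpretation="List genes associated with lung cancer that could serve as drug targets",
        database="gene_target_db",
        filters={"indication": "lung cancer", "target_class": "genetic"},
    )

def run_query(plan: QueryPlan) -> list[str]:
    # Stand-in for the call into a domain-specific model or database.
    return ["EGFR", "KRAS", "ALK"]  # placeholder results

question = "What are potential genetic targets in lung cancer?"
plan = plan_query(question)
print(f"Interpreted as: {plan.interpretation}")
if input("Proceed? [y/n] ").lower() == "y":
    for target in run_query(plan):
        print(target)
```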
Over the course of the next 20 minutes, the program identified relationships between genetic targets, surfaced additional potential targets, flagged known molecules that might have therapeutic value, ordered those compounds for laboratory testing, generated new candidate treatment compounds and even designed and set up experiments to test them. At each step of the process, it provided data, visualizations and documentation, all of which could be used later to support regulatory applications, Gibson noted.
The demo was conducted by Daniel Cohen, president of Recursion subsidiary Valence Labs, who highlighted that the large language model itself wasn’t providing any of the answers; it had simply been trained to use all of Recursion’s other models, which did the actual computational work, and then share the results in a way a human could understand. In this way, he said, “we’re not at risk of getting hallucinations,” referring to LLMs’ tendency to make up false but plausible-sounding answers to an inquiry.
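Cohen’s distinction, that the language model only routes requests to specialized models rather than answering directly, corresponds to a standard tool-dispatch layout. The rough sketch below assumes a generic dispatch table; the tool names and signatures are invented for illustration and are not Recursion’s architecture.

```python
# Illustrative tool-dispatch pattern: the language model only chooses which
# registered tool to call and with what arguments; the scientific answer
# comes from the tool itself, not from the model's own text generation.
# Tool names and return values here are hypothetical.

TOOLS = {
    "find_targets": lambda indication: ["EGFR", "KRAS"],               # stand-in for a target-discovery model
    "screen_compounds": lambda target: ["compound_17", "compound_42"], # stand-in for a compound-screening model
}

def dispatch(tool_name: str, **kwargs):
    if tool_name not in TOOLS:
        raise ValueError(f"Unknown tool: {tool_name}")  # the model cannot invent tools
    return TOOLS[tool_name](**kwargs)

# The LLM's job reduces to emitting a structured call like this one,
# then summarizing the returned data for the user.
result = dispatch("find_targets", indication="lung cancer")
print(result)
```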
Gibson says that Lowe will primarily be used by the company’s 500-plus employees, and Recursion does not plan to offer it as a product. That said, Gibson noted that the company “already has a lot of interest from our close partners” in the tool and may explore ways to give them access. Recursion may also explore offering Lowe to academic researchers down the line, though that version of the tool would likely use publicly available datasets and models rather than Recursion’s own.
In terms of future development, Gibson says the company is also looking for ways to let its software work on early-stage drug discovery with less human supervision. “Can Lowe, or a tool like it, be asking and answering questions itself?” For example, he says, a program might be asked to find potential drugs for a disease, look for and evaluate potential targets “across the genome” on its own, and then propose drug candidates and design experiments for human approval. In the meantime, Cohen says that the tool will let the company’s scientists put new datasets and machine learning models to work much more quickly, simply by training Lowe to use them.
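Onboarding a new model in a system like this is often largely a matter of registering it with a description the language model can read when deciding which tool to call. The sketch below is a hedged illustration of that registration step; the registry, tool name and toxicity model are hypothetical examples, not details of Lowe.

```python
# Sketch of onboarding a new model to an LLM interface: register a callable
# plus a natural-language description the model can use to decide when to
# call it. All names and fields are hypothetical.

registry: dict[str, dict] = {}

def register_tool(name: str, description: str, fn):
    registry[name] = {"description": description, "fn": fn}

def new_toxicity_model(compound_id: str) -> float:
    # Placeholder for a freshly trained machine learning model.
    return 0.12  # e.g., a predicted toxicity score

register_tool(
    "predict_toxicity",
    "Given a compound ID, return a predicted toxicity score between 0 and 1.",
    new_toxicity_model,
)

# The descriptions would be included in the LLM's prompt so it can route
# questions like "How toxic is compound_17?" to the new model.
for name, tool in registry.items():
    print(name, "->", tool["description"])
```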
“We’re making sure we’re putting the state of the art in the hands of Recursion scientists as quickly as possible without all the intermediate engineering steps we’d otherwise have to deploy,” he said.