Accurately predicting drug-target interactions (DTIs), the effects a drug has on biological entities such as proteins, enzymes, and receptors, plays a major role in discovering new drugs and therapeutics. With an enormous space of candidate compounds, computational methods are invaluable for narrowing the pool before experimental validation. To address this need, Talo and Bozdag (2025) propose a fusion and graph model. Since proteins and drugs have distinct structures, the model first embeds each into a vector using specialized language models (e.g., for proteins, a model able to embed protein sequence syntax into a vector). Topological embeddings are then used to capture the shape of these representations, which are assembled into a graph. To determine whether a drug and protein will interact, the sequential and topological embeddings are combined into one feature vector, and a graph neural network (GNN) with an MLP head is trained on the result.
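The fusion step described above can be sketched in plain Python. This is a minimal illustration, not the authors' implementation: the embedding dimensions, weight shapes, and the one-hidden-layer MLP head are all assumptions chosen for readability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical embedding sizes; the paper's actual dimensions may differ.
drug_emb = rng.standard_normal(128)   # e.g., from a chemical language model
prot_emb = rng.standard_normal(256)   # e.g., from a protein language model

# Fuse the two embeddings into a single feature vector by concatenation.
fused = np.concatenate([drug_emb, prot_emb])   # shape (384,)

# Tiny MLP head: one hidden ReLU layer, then a sigmoid interaction score.
W1 = rng.standard_normal((384, 64)) * 0.05
b1 = np.zeros(64)
W2 = rng.standard_normal(64) * 0.05

hidden = np.maximum(fused @ W1 + b1, 0.0)          # ReLU
score = 1.0 / (1.0 + np.exp(-(hidden @ W2)))       # probability in (0, 1)

print(fused.shape, score)
```

In the full model this score would be produced after GNN message passing over the interaction graph; the sketch only shows the feature-fusion and classification head.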
To expand upon the authors’ work, we propose recreating the original model and comparing its results with a custom model of our own. First, since the authors use Jupyter notebooks and PyTorch, we will build an equivalent base model using TensorFlow in plain Python files. We would also like to explore beyond the authors’ use of a GNN. Although more complex, Graph Transformers offer a potential improvement over GNNs: they allow all nodes to attend to one another via attention, which could incorporate additional signals, such as interactions between similar drug and target nodes. After training on the datasets used in the paper (BioSNAP and Human), we will compare validation/test performance to see whether the more advanced architecture outperforms the original model. Additionally, we plan to test beyond these datasets with alternatives like BindingDB, which contains interactions for proteins considered candidate drug targets, or DrugBank, a database of drug-target information.
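The key difference from a GNN is that attention lets every node see every other node, not just its graph neighbors. The sketch below illustrates single-head self-attention over toy node features; the node count, feature size, and weight initialization are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy graph: 5 nodes (e.g., drugs and targets) with 16-d features.
n, d = 5, 16
X = rng.standard_normal((n, d))

# Single-head self-attention: every node attends to every node,
# unlike GNN message passing, which is restricted to graph neighbors.
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
Q, K, V = X @ Wq, X @ Wk, X @ Wv

scores = Q @ K.T / np.sqrt(d)                           # (n, n) attention logits
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)                 # row-wise softmax

out = attn @ V                                          # updated node features
print(out.shape)
```

A full Graph Transformer layer would add structural encodings (e.g., edge or positional features), multiple heads, and residual connections, but the dense all-pairs attention shown here is what distinguishes it from neighborhood-limited GNN aggregation.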
Built With
- tensorflow