Search photos on Unsplash using OpenAI's CLIP model, supporting search with joint image+text queries and attention visualization.
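A minimal sketch of the joint image+text query idea (not the repository's actual code): both inputs are embedded with CLIP, the normalized embeddings are averaged into a single query vector, and photos are ranked by cosine similarity against a precomputed embedding matrix. The `photo_embeddings` matrix and the averaging strategy are assumptions for illustration.

```python
import torch
import clip
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

def joint_query_embedding(image_path: str, text: str) -> torch.Tensor:
    """Average the normalized CLIP image and text embeddings into one query vector."""
    image = preprocess(Image.open(image_path)).unsqueeze(0).to(device)
    tokens = clip.tokenize([text]).to(device)
    with torch.no_grad():
        img_emb = model.encode_image(image)
        txt_emb = model.encode_text(tokens)
    img_emb = img_emb / img_emb.norm(dim=-1, keepdim=True)
    txt_emb = txt_emb / txt_emb.norm(dim=-1, keepdim=True)
    query = (img_emb + txt_emb) / 2
    return query / query.norm(dim=-1, keepdim=True)

def search(query: torch.Tensor, photo_embeddings: torch.Tensor, k: int = 5):
    """Return indices of the k most similar photos.

    photo_embeddings: (N, 512) matrix of precomputed, L2-normalized CLIP features.
    """
    scores = photo_embeddings @ query.T          # cosine similarity (both sides normalized)
    return scores.squeeze(1).topk(k).indices.tolist()
```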
Cobweb is a multi-modal journey planner offering a server-based REST API and a lightweight frontend.
A vector database for querying meaningfully similar data.
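As a rough illustration of what "querying meaningfully similar data" involves, the sketch below does a brute-force cosine-similarity lookup over an embedding matrix with NumPy; production vector databases typically use approximate indexes (e.g. HNSW) instead of a full scan. The function and variable names are placeholders, not the project's API.

```python
import numpy as np

def top_k_similar(query: np.ndarray, vectors: np.ndarray, k: int = 5) -> np.ndarray:
    """Brute-force cosine-similarity search over an (N, D) embedding matrix."""
    q = query / np.linalg.norm(query)
    v = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    scores = v @ q                      # cosine similarities, shape (N,)
    return np.argsort(-scores)[:k]      # indices of the k most similar vectors
```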
Cherrry JavaScript SDK
CSE508 Information Retrieval course project on multi-modal search using deep learning.
FrameFinderLE is an advanced image and video frame retrieval system that enhances CLIP's image-text pairing with hashtag refinement and user feedback, offering an intuitive search experience.