🧬⚕️🔬 Encoding the World's Medical Knowledge into 970K! We're excited to release this new series of vector embeddings models for medical literature based on our recent BERT Hash work.
And you read that right, we're talking 970,000 parameters for a surprisingly strong-performing model. Enjoy!
https://huggingface.co/blog/neuml/biomedbert-hash-nano
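For readers new to embedding models: they map text to dense vectors, and retrieval quality comes down to comparing those vectors, typically with cosine similarity. A minimal sketch of that scoring step, using toy hand-picked vectors (not actual model output) purely for illustration:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional "embeddings" standing in for real model output
query = np.array([0.1, 0.9, 0.2, 0.4])
doc_a = np.array([0.1, 0.8, 0.3, 0.5])  # points in a similar direction to the query
doc_b = np.array([0.9, 0.1, 0.7, 0.0])  # points in a dissimilar direction

# The semantically closer document scores higher
print(cosine_similarity(query, doc_a) > cosine_similarity(query, doc_b))  # True
```

Swap the toy vectors for the model's encodings of real sentences and the same comparison drives semantic search over medical literature.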