Hugging Face
huggingface.co › TreeNumber › claude-3.5-sonnet
TreeNumber/claude-3.5-sonnet · Hugging Face
We’re on a journey to advance and democratize artificial intelligence through open source and open science.
Open source coding model matches with sonnet 3.5
Amazing results for a 32B model. Time to try it out. Even if you don't run it locally, it will cost about half as much as Haiku and should be a lot better.
Anthropic just released their latest model, Claude 3.5 Sonnet. Beats Opus and GPT-4o
Subreddit to discuss Llama, the large language model created by Meta AI.
Introducing computer use, a new Claude 3.5 Sonnet, and Claude 3.5 Haiku
Beware not to confuse Claude 3.5 Sonnet with Claude 3.5 Sonnet (new)! How come it seems that the further an AI company gets, the worse it gets at naming models?
Any Open Source LLMs you use that rival Claude Sonnet 3.5 in terms of coding?
Mistral coder does alright.
Videos
27:05
STUNNING RESULTS: Torture Testing Claude Sonnet 3.5 - YouTube
01:12:10
Claude 3.5 Sonnet Just Changed the AI Writing Game - YouTube
10:51
Claude Sonnet 4.5 vs GPT 5 – The Loser Is Painfully Obvious 🚨 ...
07:28
Hello Claude Sonnet 4.5! This thing is a BEAST! - YouTube
30:50
Claude Sonnet 4.5 Is INSANE – Real-Time Coding, UI, and Software ...
Anthropic
anthropic.com › news › claude-3-5-sonnet
Introducing Claude 3.5 Sonnet
Claude 3.5 Sonnet is our strongest vision model yet, surpassing Claude 3 Opus on standard vision benchmarks. These step-change improvements are most noticeable for tasks that require visual reasoning, like interpreting charts and graphs.
Reddit
reddit.com › r/claudeai › open source coding model matches with sonnet 3.5
r/ClaudeAI on Reddit: Open source coding model matches with sonnet 3.5
November 11, 2024 - Generation speed will be much lower than Sonnet/Haiku. ... Minimum 24GB for running it well. But you could run it on less by partially loading it to RAM (LM Studio has a slider for this) - it will be very slow, but 32B quantized is not that large, so it might be quite usable on, for example, 16GB cards. It is also free on Hugging Face Chat, by the way.
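The memory figures in the snippet above follow from simple arithmetic. A minimal sketch, assuming 4-bit quantization and counting weights only (KV cache and activations add overhead on top), of why a 32B model fits a 24GB card but needs partial RAM offload on 16GB:

```python
def quantized_weight_gib(params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GiB for a quantized model."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

# A 32B model at 4-bit needs roughly 15 GiB for the weights alone,
# so a 24GB card holds it with headroom for the KV cache, while a
# 16GB card must offload some layers to system RAM.
print(f"~{quantized_weight_gib(32, 4):.1f} GiB of weights at 4-bit")
```

The same function shows why an 8-bit quant of the same model (~30 GiB) would not fit either card, which is why 4-bit quants are the usual choice for consumer GPUs.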
Zilliz
zilliz.com › tutorials › rag › langchain-and-milvus-and-anthropic-claude-3.5-sonnet-and-huggingface-all-mpnet-base-v2
Build RAG Chatbot with LangChain, Milvus, Anthropic Claude 3.5 Sonnet, and HuggingFace all-mpnet-base-v2
Anthropic Claude 3.5 Sonnet: This advanced model in the Claude 3 family is designed for nuanced understanding and creative language generation. With enhanced prompt comprehension and contextual awareness, it excels in complex dialogue, creative writing, and sophisticated content creation.
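The tutorial above names the components of a typical RAG stack; at its core, the retrieval step is nearest-neighbor search over embeddings. A minimal self-contained sketch of that step using plain cosine similarity, with toy vectors standing in for all-mpnet-base-v2 embeddings and a list standing in for a Milvus collection:

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, docs, k=1):
    """Return the k documents whose embeddings are closest to the query."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return ranked[:k]

docs = [
    {"text": "Claude 3.5 Sonnet pricing", "vec": [0.9, 0.1, 0.0]},
    {"text": "Milvus index types",        "vec": [0.1, 0.9, 0.2]},
]
top = retrieve([0.8, 0.2, 0.1], docs, k=1)
print(top[0]["text"])
```

In the real pipeline, Milvus replaces the brute-force sort with an approximate index, and the retrieved text is stuffed into the Claude 3.5 Sonnet prompt as context.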
Hugging Face
huggingface.co › blog › rcaulk › phi-3-mini-4k-instruct-graph
Outperforming Claude 3.5 Sonnet with Phi-3-mini-4k for graph entity relationship extraction tasks
August 19, 2024 - We fine-tuned Phi-3-mini-4k to exceed Claude Sonnet 3.5 for graph extraction quality by up to 20% and to reduce cost by orders of magnitude. Further, we improve upon the already impressive JSON output structure of Phi-3-mini-4k, reducing the parsing error rate from 2.5% to 0. We also release two additional versions, Phi-3-medium-4k-instruct-graph and Phi-3-medium-128k-instruct-graph, aimed at high reasoning and longer contexts. We also set up a Hugging Face Space hosting our fine-tuned model, which is designed to ingest any text and visualize the output as a graph.
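The parsing-error-rate metric quoted above is simple to compute: attempt strict JSON parsing on each raw model output and count failures. A hedged sketch (the sample strings here are illustrative, not from the post):

```python
import json

def json_parse_error_rate(outputs):
    """Fraction of raw model outputs that fail strict JSON parsing."""
    failures = 0
    for raw in outputs:
        try:
            json.loads(raw)
        except json.JSONDecodeError:
            failures += 1
    return failures / len(outputs)

samples = [
    '{"nodes": ["A", "B"], "edges": [["A", "B"]]}',  # valid graph output
    '{"nodes": ["A", "B"], "edges": [["A", "B"]',    # truncated, invalid
]
print(json_parse_error_rate(samples))
```

Driving this rate to exactly 0 means every generated graph can be consumed downstream without a repair or retry step.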
Reddit
reddit.com › r › LocalLLaMA › comments › 1dkctue › anthropic_just_released_their_latest_model_claude
Anthropic just released their latest model, Claude 3.5 ...
November 21, 2023 - Subreddit to discuss Llama, the large language model created by Meta AI.
YouTube
youtube.com › watch
Building a RAG Pipeline with Anthropic Claude Sonnet 3.5 - YouTube
In this video, we explore and test the coding capabilities of Claude Sonnet 3.5, Anthropic's latest model. We begin by providing a diagram of a RAG (Retrieva...
Published June 22, 2024
Hugging Face
huggingface.co › codelion
codelion (Asankhaya Sharma)
Try it now:

    from adaptive_classifier import AdaptiveClassifier

    # Load with ONNX automatically enabled (quantized for best performance)
    classifier = AdaptiveClassifier.load("adaptive-classifier/llm-router")

    # Add examples dynamically
    classifier.add_examples(
        ["Route this to GPT-4", "Simple task for GPT-3.5"],
        ["strong", "weak"]
    )

    # Predict with optimized inference
    predictions = classifier.predict("Complex reasoning task")

Check out our LLM Router model to see it in action: https://huggingface.co/adaptive-classifier/llm-router
GitHub Repository: https://github.com/codelion/adaptive-classifier
Install now: pip install adaptive-classifier
We'd love to hear your feedback and see what you build with it!