@zainhas
Created January 24, 2025 00:32
Extract ONLY the thinking tokens from DeepSeek-R1: stop generation at </think> so the response contains just the model's reasoning, then pass that reasoning to a smaller model to produce the final answer.
from together import Together

# TOGETHER_API_KEY is assumed to hold your Together API key.
client = Together(api_key=TOGETHER_API_KEY)

question = "Which is larger 9.9 or 9.11?"

# Ask DeepSeek-R1, but stop generation at </think> so the response
# contains only the model's thinking tokens, not its final answer.
thought = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-R1",
    messages=[{"role": "user", "content": question}],
    stop=["</think>"],
)
PROMPT_TEMPLATE = """
Thought process: {thinking_tokens} </think>
Question: {question}
Answer:
"""
# Hand the thinking tokens to a smaller model (Llama 3.1 8B) to produce the answer.
answer = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    messages=[{
        "role": "user",
        "content": PROMPT_TEMPLATE.format(
            thinking_tokens=thought.choices[0].message.content, question=question
        ),
    }],
)
print(answer.choices[0].message.content)
# Answer: 9.9 is larger than 9.11.
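The same two-step flow can be wrapped into a reusable helper. The sketch below is a generalization of the gist, not part of the original: the function name answer_with_borrowed_thinking, the environment-variable key lookup, and the stripping of a possible leading <think> tag are assumptions introduced here; the Together models are the same ones used above.

import os
from together import Together

# Assumes TOGETHER_API_KEY is exported in the environment.
client = Together(api_key=os.environ["TOGETHER_API_KEY"])

REASONER = "deepseek-ai/DeepSeek-R1"
RESPONDER = "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo"

def answer_with_borrowed_thinking(question: str) -> str:
    """Extract DeepSeek-R1's thinking tokens, then let a smaller model write the answer."""
    # Step 1: stop generation at </think> so only the reasoning is returned.
    thought = client.chat.completions.create(
        model=REASONER,
        messages=[{"role": "user", "content": question}],
        stop=["</think>"],
    )
    thinking = thought.choices[0].message.content
    # R1 may open its reasoning with a <think> tag; strip it if present (assumption).
    thinking = thinking.removeprefix("<think>").strip()

    # Step 2: replay the reasoning ahead of the question for the smaller model.
    prompt = f"Thought process: {thinking} </think>\nQuestion: {question}\nAnswer:"
    answer = client.chat.completions.create(
        model=RESPONDER,
        messages=[{"role": "user", "content": prompt}],
    )
    return answer.choices[0].message.content

print(answer_with_borrowed_thinking("Which is larger 9.9 or 9.11?"))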