shijiefengjun, chenhaodong, and Kevin Hu committed
Commit 9ef0b16 · 1 parent: 587bed3

Fix the value issue of anthropic (#3351)


### What problem does this PR solve?

This pull request fixes the issue mentioned in
https://github.com/infiniflow/ragflow/issues/3263.

1. The response should be parsed as a dict, to prevent the following code from
failing to extract values:
```ans = response["content"][0]["text"]```
2. The API model ```claude-instant-1.2``` has been retired (see
[model-deprecations](https://docs.anthropic.com/en/docs/resources/model-deprecations))
and now triggers errors in the code, so I deleted it from the
conf/llm_factories.json file and added the latest API model
```claude-3-5-sonnet-20241022```.
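
The first point can be illustrated with a minimal stdlib-only sketch. `FakeMessage` below is a hypothetical stand-in for the SDK's response object, assuming it exposes a pydantic-style `.json()` (which serializes to a JSON *string*) and a `.to_dict()` (which returns a plain dict), as the diff implies:

```python
import json

class FakeMessage:
    """Hypothetical stand-in for the Anthropic SDK response model."""

    def __init__(self, payload):
        self._payload = payload

    def json(self):
        # Like pydantic's .json(): serializes the model to a JSON string.
        return json.dumps(self._payload)

    def to_dict(self):
        # Returns a plain dict, so subscript access works.
        return dict(self._payload)

payload = {"content": [{"text": "hello"}], "stop_reason": "end_turn"}
msg = FakeMessage(payload)

# Before the fix: .json() yields a str, so dict-style key lookup fails.
response = msg.json()
try:
    _ = response["content"][0]["text"]
except TypeError:
    print("str cannot be indexed by key")

# After the fix: .to_dict() yields a dict and the lookup succeeds.
response = msg.to_dict()
ans = response["content"][0]["text"]
print(ans)
```

This is why the patch swaps `.json()` for `.to_dict()`: the downstream code indexes the response with string keys, which only works on a dict.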



### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)

---------

Co-authored-by: chenhaodong <[email protected]>
Co-authored-by: Kevin Hu <[email protected]>

Files changed (2)
  1. conf/llm_factories.json +2 -2
  2. rag/llm/chat_model.py +1 -1
conf/llm_factories.json CHANGED
```diff
@@ -2371,8 +2371,8 @@
       "model_type": "chat"
     },
     {
-      "llm_name": "claude-instant-1.2",
-      "tags": "LLM,CHAT,100k",
+      "llm_name": "claude-3-5-sonnet-20241022",
+      "tags": "LLM,CHAT,200k",
       "max_tokens": 102400,
       "model_type": "chat"
     }
```
rag/llm/chat_model.py CHANGED
```diff
@@ -1260,7 +1260,7 @@ class AnthropicChat(Base):
             system=self.system,
             stream=False,
             **gen_conf,
-        ).json()
+        ).to_dict()
         ans = response["content"][0]["text"]
         if response["stop_reason"] == "max_tokens":
             ans += (
```