langchain-openai: support custom fallback content for tool-only messages #30803
base: master
Conversation
- Some APIs (e.g., Gemini) require non-empty string content in all messages.
- This adds support for specifying custom fallback content when a message contains only tool calls.
Tool calling is supported for Gemini models via the langchain-google-genai and langchain-google-vertexai packages, depending on how you are authenticating. See docs here. Will these work for your use-case?
Many providers offer a chat-completions API such that you can use the OpenAI SDK to interact with them. But they often feature small differences like this. If the dedicated Google packages will work for you, that's preferable to attempting to support more providers via ChatOpenAI.
I gave it a try, and the issue seems to be present in both. Since I frequently switch between models, and most providers support the OpenAI SDK, I prefer sticking with that: it works out of the box and avoids these edge cases. I get that keeping things simple is important, and adding parameters everywhere isn't ideal. Maybe when the provider is detected to be Google, we could just make sure the content field is never left empty? It's not a huge issue anyway; I ran into it using the built-in LangGraph React Agent, and I worked around it by adjusting the messages with a "prompt" pre-hook.
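For context, the pre-hook workaround mentioned above amounts to something like the following. This is an illustrative, standalone sketch that models messages as plain OpenAI-style dicts rather than LangChain message objects; the function name fill_empty_content and the " " fallback are assumptions for illustration, not code from this PR.

```python
def fill_empty_content(messages, fallback=" "):
    """Return a copy of OpenAI-style message dicts where any message
    with missing or empty string content gets a fallback string instead."""
    patched = []
    for msg in messages:
        msg = dict(msg)  # shallow copy so the input list is untouched
        if not msg.get("content"):  # catches both None and ""
            msg["content"] = fallback
        patched.append(msg)
    return patched

# An assistant turn that only carries a tool call, with content=None --
# the shape that strict endpoints reject with 400 Bad Request.
history = [
    {"role": "user", "content": "Add 2 and 3."},
    {"role": "assistant", "content": None,
     "tool_calls": [{"id": "call_1", "type": "function",
                     "function": {"name": "add_numbers",
                                  "arguments": '{"a": 2, "b": 3}'}}]},
]

patched = fill_empty_content(history)
print(patched[1]["content"])  # the tool-only message now carries " " as content
```

In a real LangGraph agent this kind of normalization would run inside the pre-hook before the messages are sent to the model; the dict-based version here just shows the transformation itself.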
Thanks @louisgthier. Are you able to provide a minimal reproducible example? This would help us figure out the best solution.
from langchain_core.tools import tool
from langchain_core.messages import HumanMessage, SystemMessage
from langchain_google_genai import ChatGoogleGenerativeAI
from langgraph.prebuilt import create_react_agent
from langgraph.checkpoint.memory import MemorySaver
from typing import List, Any
import os

# 1. Define a simple tool
@tool
def add_numbers(a: int, b: int) -> int:
    """Add two numbers and return the result."""
    return a + b

# 2. Set up the Gemini model
model_name = "gemini-2.0-flash"
api_key = os.getenv("GEMINI_API_KEY")
llm = ChatGoogleGenerativeAI(
    model=model_name,
    google_api_key=api_key,
)
llm = llm.bind_tools([add_numbers])

# 3. Define a minimal state class
class State(dict):
    messages: List[Any]
    remaining_steps: int

# 4. Create the agent
graph = create_react_agent(
    llm,
    [add_numbers],
    state_schema=State,
    checkpointer=MemorySaver(),
)

def main():
    # 5. Prepare the initial state
    system = SystemMessage(content="You are a helpful assistant. Use tools if needed.")
    user = HumanMessage(content="Please use the add_numbers tool to add 2 and 3.")
    state = State(messages=[system, user], remaining_steps=5)  # 5 is arbitrary, just needs to be present

    # 6. Run the agent
    result = graph.invoke(state, config={"configurable": {"thread_id": "test-thread"}})
    print("Final messages:")
    for msg in result["messages"]:
        print(f"{type(msg).__name__}: {getattr(msg, 'content', msg)}")

if __name__ == "__main__":
    main()

When running this simple script, I get:
Please note I have the same result with ChatOpenAI.
Hi @louisgthier, Thanks very much for the minimal example. I've simplified it a bit further using documented usage patterns, and it appears to run:

from langchain_core.messages import HumanMessage, SystemMessage
from langchain_google_genai import ChatGoogleGenerativeAI
from langgraph.prebuilt import create_react_agent
from langgraph.checkpoint.memory import MemorySaver

def add_numbers(a: int, b: int) -> int:
    """Add two numbers and return the result."""
    return a + b

llm = ChatGoogleGenerativeAI(model="gemini-2.0-flash")
graph = create_react_agent(
    llm,
    [add_numbers],
    checkpointer=MemorySaver(),
    prompt="You are a helpful assistant. Use tools if needed.",
)

config = {"configurable": {"thread_id": "abc123"}}
for step in graph.stream(
    {"messages": [HumanMessage(content="Please use the add_numbers tool to add 2 and 3.")]},
    config,
    stream_mode="values",
):
    step["messages"][-1].pretty_print()
Some APIs (notably Gemini, when accessed via the OpenAI-compatible endpoint) require every message to have a non-null, non-empty string as content. However, messages that contain only tool calls (e.g., ToolMessage) often have content=None, leading to a 400 Bad Request.

This PR adds support for specifying a fallback string (e.g. " " or "N/A") to populate the content field of tool-only messages, ensuring compatibility with stricter API implementations.

Issue: N/A — discovered while integrating LangChain with Gemini API.

Dependencies: None
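In essence, the proposed behavior boils down to a per-message substitution applied while building the request payload. The sketch below is illustrative only: the helper name ensure_tool_message_content and the default fallback value are assumptions, not the actual parameter names or code added by this PR.

```python
def ensure_tool_message_content(payload_msg: dict, fallback: str = "N/A") -> dict:
    """If an OpenAI-style message dict carries tool calls but no string
    content, return a copy with the fallback substituted so that strict
    endpoints (e.g. Gemini's OpenAI-compatible API) do not reject the
    request with 400 Bad Request. Other messages pass through unchanged."""
    if payload_msg.get("tool_calls") and not payload_msg.get("content"):
        payload_msg = {**payload_msg, "content": fallback}
    return payload_msg

# A tool-only assistant message, the case this PR targets
msg = {"role": "assistant", "content": None,
       "tool_calls": [{"id": "call_1", "type": "function",
                       "function": {"name": "add_numbers",
                                    "arguments": '{"a": 2, "b": 3}'}}]}
fixed = ensure_tool_message_content(msg)
print(fixed["content"])  # "N/A"
```

Messages that already have string content, or that have no tool calls, are left untouched, so the substitution only affects the tool-only case that stricter providers reject.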