### Reproduction Steps / Example Code (Python)

```python
from langchain.chains import LLMChain
from langchain_core.prompts import PromptTemplate
from langchain_community.llms.fake import FakeListLLM

llm = FakeListLLM(responses=["hello"])
prompt = PromptTemplate(input_variables=["input"], template="{input}")
chain = LLMChain(llm=llm, prompt=prompt)
chain.save("/tmp/test_chain.yaml")  # Works on langchain 0.3.27, fails on 0.3.28
```
### Error Message and Stack Trace

```
Traceback (most recent call last):
  File "<string>", line 9, in <module>
  File "langchain/chains/base.py", line 785, in save
    raise NotImplementedError(msg)
NotImplementedError: Chain verbose=False prompt=PromptTemplate(input_variables=['input'], input_types={}, partial_variables={}, template='{input}') llm=FakeListLLM(responses=['hello']) output_parser=StrOutputParser() llm_kwargs={} does not support saving.
```
### System Info

> langchain_core: 0.3.83
> langchain: 0.3.28
> langchain_community: 0.3.31
> Python Version: 3.10.17
> OS: Darwin
### Package (Required)
langchain

### Related Issues / PRs
- #33035

### Description
#33035 changed `Chain.save()` from `self.dict()` to `self.model_dump()`. `Chain.dict()` injects `_type` into the serialized dict via the `_chain_type` property, but `model_dump()` is inherited from Pydantic's `BaseModel` and never includes it. As a result, `save()` now always hits the `"_type" not in chain_dict` check and raises `NotImplementedError`, regardless of the chain type.
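The mechanism can be illustrated outside LangChain with a minimal Pydantic v2 sketch. `Chainish` and `dict_with_type` below are hypothetical stand-ins for `Chain` and the old `Chain.dict()` behavior, not the real LangChain classes; the point is that `model_dump()` only serializes declared fields and silently skips plain `@property` values like `_chain_type`:

```python
from pydantic import BaseModel


class Chainish(BaseModel):
    # A declared field: included by model_dump().
    name: str = "demo"

    @property
    def _chain_type(self) -> str:
        # A plain property: NOT a Pydantic field, so model_dump() skips it.
        return "llm_chain"

    def dict_with_type(self) -> dict:
        # Mimics the old-style dict(): start from model_dump(), then
        # inject "_type" by hand from the property.
        d = self.model_dump()
        d["_type"] = self._chain_type
        return d


c = Chainish()
print(c.model_dump())      # {'name': 'demo'} -- no '_type' key
print(c.dict_with_type())  # {'name': 'demo', '_type': 'llm_chain'}
```

Any caller that checks for `"_type"` in the output of `model_dump()` will therefore always fail unless the key is injected explicitly, which is exactly the check `save()` trips over.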