Functions with Streaming
Hopfield makes it easy to use streaming with function calling. You define validation-driven functions which get passed to the LLM.
Usage
Use streaming function calling like this:
```ts
import z from "zod";
import hop from "hopfield";
import openai from "hopfield/openai";
import OpenAI from "openai";

const hopfield = hop.client(openai).provider(new OpenAI());

const weatherFunction = hopfield.function({
  name: "getCurrentWeather",
  description: "Get the current weather in a given location",
  parameters: z.object({
    location: z.string().describe("The city and state, e.g. San Francisco, CA"),
    unit: z
      .enum(["celsius", "fahrenheit"])
      .describe(hopfield.template().enum("The unit for the temperature.")),
  }),
});

const chat = hopfield.chat().streaming().functions([weatherFunction]);

const messages: hop.inferMessageInput<typeof chat>[] = [
  {
    role: "user",
    content: "What's the weather in San Jose?",
  },
];

const response = await chat.get(
  {
    messages,
  },
  {
    onChunk(chunk) {
      console.log(`Received chunk type: ${chunk.choices[0].__type}`);
      // do something on the server with each individual chunk as it is
      // streamed in
    },
    onDone(chunks) {
      console.log(`Total chunks received: ${chunks.length}`);
      // do something on the server when the chat completion is done
      // this can be caching the response, storing in a database, etc.
      //
      // `chunks` is an array of all the streamed responses, so you
      // can access the raw content and combine how you'd like
    },
    async onFunctionCall(fn) {
      // do something based on the function call result - this
      // is parsed by your function definition with zod, and
      // the arguments are coerced into the object shape you expect
      await takeAction(fn.name, fn.arguments);
    },
  },
);
```
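The `takeAction` call in `onFunctionCall` is not part of Hopfield - it stands in for whatever server-side handler you want to run when the model calls a function. Here is a minimal sketch of what it could look like, assuming the `weatherFunction` schema above and a hypothetical `fetchWeather` lookup:

```ts
// Hypothetical weather lookup - replace with a real API call or database read.
async function fetchWeather(location: string, unit: "celsius" | "fahrenheit") {
  return { location, unit, temperature: 21 };
}

// Handler invoked from `onFunctionCall` above. The arguments have already
// been parsed and coerced by the zod schema on `weatherFunction`.
async function takeAction(
  name: string,
  args: { location: string; unit: "celsius" | "fahrenheit" },
) {
  if (name === "getCurrentWeather") {
    const weather = await fetchWeather(args.location, args.unit);
    console.log(
      `Current temperature in ${weather.location}: ${weather.temperature} ${weather.unit}`,
    );
  }
}
```

Because the arguments are validated against the zod schema before `onFunctionCall` fires, the handler can use them directly without re-validating.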
Feedback
To influence these features, reach out on Discord or GitHub Discussions. We want your feedback!