server : free llama_batch on exit (#7212)

* [server] Clean up a memory leak on exit

There are a couple of memory leaks on exit of the server, and this one hides
the others. After cleaning it up, leaks in the slots become visible; those
will be addressed in a follow-up patch. A minimal sketch of the cleanup
pattern follows this list.

* Convert a tab to spaces
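
Below is a minimal sketch of the teardown order the patch aims for, assuming
the server owns the model, context, and batch as members. The struct is a
simplified stand-in for server_context (the real struct has many more fields);
only public llama.h calls are used.

    #include "llama.h"

    // Simplified stand-in for server_context: owns the model, the context,
    // and the batch created with llama_batch_init().
    struct server_context_sketch {
        llama_model   * model = nullptr;
        llama_context * ctx   = nullptr;
        llama_batch     batch = {};

        ~server_context_sketch() {
            // Free in reverse order of creation; llama_batch_free() releases
            // the arrays allocated by llama_batch_init() and is safe to call
            // on a zero-initialized batch.
            if (ctx) {
                llama_free(ctx);
                ctx = nullptr;
            }
            if (model) {
                llama_free_model(model);
                model = nullptr;
            }
            llama_batch_free(batch);
        }
    };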
Author: Steve Grubb
Date:   2024-05-11 04:13:02 -04:00 (committed by GitHub)
Parent: f99e1e456e
Commit: 988631335a

@@ -673,6 +673,8 @@ struct server_context {
             llama_free_model(model);
             model = nullptr;
         }
+
+        llama_batch_free(batch);
     }
     bool load_model(const gpt_params & params_) {
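
For context, llama_batch_init() and llama_batch_free() are a matched pair in
the public llama.h API; the sketch below illustrates the batch lifetime that
the destructor change completes. The capacity of 512 tokens and the
single-sequence arguments are illustrative, not the server's actual values.

    #include "llama.h"

    static void batch_lifetime_sketch() {
        // Allocate storage for up to 512 tokens, token ids rather than
        // embeddings (embd = 0), and at most 1 sequence id per token.
        llama_batch batch = llama_batch_init(512, 0, 1);

        // ... fill the batch and run llama_decode() while serving ...

        // Without this call, the arrays inside the batch leak on exit.
        llama_batch_free(batch);
    }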