new version 4.1.8
mgonzs13 committed Jan 7, 2025
1 parent 0ee364f commit 9bf2fce
Showing 8 changed files with 9 additions and 9 deletions.
2 changes: 1 addition & 1 deletion llama_bringup/package.xml
@@ -2,7 +2,7 @@
<?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="3">
<name>llama_bringup</name>
-<version>4.1.7</version>
+<version>4.1.8</version>
<description>Bringup package for llama_ros</description>
<maintainer email="mgons@unileon.es">Miguel Ángel González Santamarta</maintainer>
<license>MIT</license>
2 changes: 1 addition & 1 deletion llama_cli/package.xml
@@ -2,7 +2,7 @@
<?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="3">
<name>llama_cli</name>
-<version>4.1.7</version>
+<version>4.1.8</version>
<description>Cli package for llama_ros</description>
<maintainer email="mgons@unileon.es">Miguel Ángel González Santamarta</maintainer>
<license>MIT</license>
2 changes: 1 addition & 1 deletion llama_cli/setup.py
@@ -3,7 +3,7 @@

setup(
name="llama_cli",
-version="4.1.7",
+version="4.1.8",
packages=find_packages(exclude=["test"]),
zip_safe=True,
author="Miguel Ángel González Santamarta",
2 changes: 1 addition & 1 deletion llama_cpp_vendor/package.xml
@@ -2,7 +2,7 @@
<?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="3">
<name>llama_cpp_vendor</name>
-<version>4.1.7</version>
+<version>4.1.8</version>
<description>Vendor package for llama.cpp.</description>
<maintainer email="mgons@unileon.es">Miguel Ángel González Santamarta</maintainer>
<license>MIT</license>
2 changes: 1 addition & 1 deletion llama_demos/package.xml
@@ -2,7 +2,7 @@
<?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="3">
<name>llama_demos</name>
-<version>4.1.7</version>
+<version>4.1.8</version>
<description>Demos for llama_ros</description>
<maintainer email="mgons@unileon.es">Miguel Ángel González Santamarta</maintainer>
<license>MIT</license>
2 changes: 1 addition & 1 deletion llama_msgs/package.xml
@@ -2,7 +2,7 @@
<?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="3">
<name>llama_msgs</name>
-<version>4.1.7</version>
+<version>4.1.8</version>
<description>Msgs for llama_ros</description>
<maintainer email="mgons@unileon.es">Miguel Ángel González Santamarta</maintainer>
<license>MIT</license>
2 changes: 1 addition & 1 deletion llama_ros/package.xml
@@ -2,7 +2,7 @@
<?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="3">
<name>llama_ros</name>
-<version>4.1.7</version>
+<version>4.1.8</version>
<description>llama.cpp for ROS 2</description>
<maintainer email="mgons@unileon.es">Miguel Ángel González Santamarta</maintainer>
<license>MIT</license>
4 changes: 2 additions & 2 deletions llama_ros/src/llama_ros/llama.cpp
@@ -817,13 +817,13 @@ Llama::find_stop(std::vector<struct CompletionOutput> completion_result_list,

// respect the maximum number of tokens
if (this->n_past > this->params.n_predict && this->params.n_predict != -1) {
-    LLAMA_LOG_INFO("Maximun number of tokens reached %d",
+    LLAMA_LOG_INFO("Maximum number of tokens reached %d",
this->params.n_predict);
return FULL_STOP;
}

if (this->n_past > this->get_n_ctx() && this->params.n_predict == -2) {
-    LLAMA_LOG_INFO("Maximun number of tokens reached %d", this->get_n_ctx());
+    LLAMA_LOG_INFO("Maximum number of tokens reached %d", this->get_n_ctx());
return FULL_STOP;
}

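The two guards touched by this hunk (only a log-message typo changes) implement the token-limit stop check: `n_predict == -1` means no limit, and `n_predict == -2` means generate until the context window is exhausted. A minimal standalone sketch of that logic, with hypothetical names (`StopType`, `check_token_limit`) in place of the `Llama` class members and with the `LLAMA_LOG_INFO` calls omitted:

```cpp
#include <cassert>

// Hypothetical stand-ins for the enum and member fields used in llama.cpp.
enum StopType { NO_STOP, FULL_STOP };

// Mirrors the two guards in Llama::find_stop:
//   n_predict == -1 -> no token limit
//   n_predict == -2 -> stop only when the context window (n_ctx) is full
StopType check_token_limit(int n_past, int n_predict, int n_ctx) {
  // respect the maximum number of tokens
  if (n_past > n_predict && n_predict != -1) {
    return FULL_STOP;
  }
  // generate-until-context-full mode
  if (n_past > n_ctx && n_predict == -2) {
    return FULL_STOP;
  }
  return NO_STOP;
}
```

Note the order of the checks mirrors the diff exactly: the first guard is evaluated before the `n_predict == -2` case.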
