This service is part of the RedBorder Incident Response system. Its task is to use an AI model to generate the title and description of an incident. The model used is llamafile (/~https://github.com/Mozilla-Ocho/llamafile), and it is consumed from the redborder-webui via API.
Platforms
- Rocky Linux 9
Installation
- Install the redborder repository following the steps described at https://packages.redborder.com/
- yum install redborder-ai
Usage
For specific details use:
rb_ai.sh --help
Simple usage:
rb_ai.sh -p "your prompt goes here"
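For example, a prompt asking the model for a short incident title could look like the following (the prompt text is only illustrative):
# Illustrative prompt; adjust the wording to your own alerts
rb_ai.sh -p "Create a short, descriptive incident title for this signature: SERVER-WEBAPP TP-Link Archer Router command injection attempt"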
The model is also run as a service listening on port 50505.
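To confirm that the service is running and listening, the port can be checked directly. The systemd unit name redborder-ai used below is an assumption and may differ on your installation:
# Check the service status (unit name is an assumption)
systemctl status redborder-ai
# Verify that the API is listening on port 50505
ss -tlnp | grep 50505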
The model provides an OpenAI API compatible chat completions endpoint, designed to support the most common OpenAI API use cases. See the OpenAI API documentation for details.
Example:
The following example generates the title of an incident from the signatures passed in the prompt.
curl http://<ip>:50505/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer no-key" \
  -d '{
    "model": "LLaMA_CPP",
    "messages": [
      {
        "role": "system",
        "content": "You are RedBorderAI, an AI assistant that is an expert in web requests and alerts. Your top priority is achieving user fulfillment by helping them with their requests and creating short and descriptive titles about incidents."
      },
      {
        "role": "user",
        "content": "Explain these snort rules:\nSERVER-WEBAPP TP-Link Archer Router command injection attempt\nsmtp: Attempted command buffer overflow\n"
      }
    ]
  }'
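Because the endpoint follows the OpenAI chat completions response format, the generated text can be extracted from the JSON reply with jq. This is a minimal sketch, assuming jq is installed and <ip> is replaced with the host running the service:
# Ask for a short title and print only the generated text
curl -s http://<ip>:50505/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer no-key" \
  -d '{
    "model": "LLaMA_CPP",
    "messages": [
      {"role": "user", "content": "Create a short title for this alert: smtp: Attempted command buffer overflow"}
    ]
  }' | jq -r '.choices[0].message.content'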
Contributing
- Fork the repository on GitHub
- Create a named feature branch (like add_component_x)
- Write your change
- Write tests for your change (if applicable)
- Run the tests, ensuring they all pass
- Submit a Pull Request using GitHub
Authors
- Pablo Pérez González <pperez@redborder.com>