Wrap Interactions

To initiate and wrap a dialogue between a user and an LLM tutor locally, follow this tutorial.

Module for launching the Textual chat interface with an LM Studio-backed language model. It should be run from the command line.

This class configures the model, system prompt, and save path, and then starts an interactive, full-screen terminal chat application using the textual framework. The interface supports live message streaming, styled user and assistant blocks, and automatic logging of chat history to JSON and CSV formats.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `api_url` | `str` | URL of the LM Studio API endpoint used for inference. | `'http://127.0.0.1:1234/v1/chat/completions'` |
| `model_name` | `str` | Name of the model to use in LM Studio (e.g., `"llama-3.2-3b-instruct"`). | `'llama-3.2-3b-instruct'` |
| `temperature` | `float` | Sampling temperature for text generation; controls randomness. | `0.7` |
| `system_prompt` | `str` | Initial system message to prime the assistant's behavior. | `'You are a helpful tutor guiding a student. Answer short and concisely.'` |
| `save_dir` | `Path` | Directory for saving logged conversations as `.json` and `.csv`. | `Path('data/logged_dialogue_data')` |

Methods:

| Name | Description |
| --- | --- |
| `run` | Launches the Textual chat application and starts interaction with the model. |
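Besides the command-line entry point, the class can be started from Python. The sketch below is a minimal example of wiring up the parameters listed above; the import path `educhateval.chat_ui` is inferred from the source location shown below and should be verified for your install:

```python
from pathlib import Path


def launch_chat():
    # Import path inferred from src/educhateval/chat_ui.py; verify for your install.
    from educhateval.chat_ui import ChatWrap

    chat = ChatWrap(
        api_url="http://127.0.0.1:1234/v1/chat/completions",
        model_name="llama-3.2-3b-instruct",
        temperature=0.7,
        system_prompt="You are a helpful tutor guiding a student. Answer short and concisely.",
        save_dir=Path("data/logged_dialogue_data"),
    )
    chat.run()  # blocks until the Textual app exits
```

Call `launch_chat()` from a terminal session; the full-screen Textual UI takes over until you exit, after which the conversation log is written to `save_dir`.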

Source code in src/educhateval/chat_ui.py
```python
class ChatWrap:
    """
    Module for launching the Textual chat interface with an LM Studio-backed language model. It should be run from the command line.

    This class configures the model, system prompt, and save path, and then starts an interactive, full-screen terminal chat
    application using the `textual` framework. The interface supports live message streaming, styled user and assistant blocks,
    and automatic logging of chat history to JSON and CSV formats.

    Parameters:
        api_url (str): URL of the LM Studio API endpoint used for inference. Defaults to "http://127.0.0.1:1234/v1/chat/completions".
        model_name (str): Name of the model to use in LM Studio (e.g., "llama-3.2-3b-instruct").
        temperature (float): Sampling temperature for text generation. Controls randomness. Defaults to 0.7.
        system_prompt (str): Initial system message to prime the assistant’s behavior. Defaults to a helpful tutoring message.
        save_dir (Path): Directory for saving logged conversations as `.json` and `.csv`. Defaults to "data/logged_dialogue_data".

    Methods:
        run(): Launches the Textual chat application and starts interaction with the model.
    """

    def __init__(
        self,
        api_url: str = "http://127.0.0.1:1234/v1/chat/completions",
        model_name: str = "llama-3.2-3b-instruct",
        temperature: float = 0.7,
        system_prompt: str = "You are a helpful tutor guiding a student. Answer short and concisely.",
        save_dir: Path = Path("data/logged_dialogue_data"),
    ):
        self.api_url = api_url
        self.model_name = model_name
        self.temperature = temperature
        self.system_prompt = system_prompt
        self.save_dir = save_dir

        # Initialize model and conversation history
        self.model = ChatLMStudio(
            api_url=self.api_url,
            model_name=self.model_name,
            temperature=self.temperature,
        )
        self.chat_history = ChatHistory(
            messages=[ChatMessage(role="system", content=self.system_prompt)]
        )

    def run(self):
        """Launches the Textual app."""
        app = ChatApp(
            model=self.model,
            chat_history=self.chat_history,
            chat_messages_dir=self.save_dir,
        )
        app.run()
```

run()

Launches the Textual app.

Source code in src/educhateval/chat_ui.py
```python
def run(self):
    """Launches the Textual app."""
    app = ChatApp(
        model=self.model,
        chat_history=self.chat_history,
        chat_messages_dir=self.save_dir,
    )
    app.run()
```

Example Usage from Terminal:

```shell
chat-ui \
  --api_url http://127.0.0.1:1234/v1/chat/completions \
  --model llama-3.2-3b-instruct \
  --prompt "You are a helpful tutor guiding a student." \
  --save_dir data/logged_dialogue_data
```
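After a session ends, the conversation is logged to `save_dir` as `.json` and `.csv`. The exact on-disk schema is an assumption here (a JSON list of `{"role": ..., "content": ...}` messages, mirroring the `ChatMessage` fields in the source above); a minimal sketch for loading such a log:

```python
import json
import tempfile
from pathlib import Path


def load_chat_log(path: Path) -> list[dict]:
    """Load a logged conversation, assumed to be a JSON list of
    {"role": ..., "content": ...} messages (schema is an assumption),
    dropping the system message."""
    with path.open(encoding="utf-8") as f:
        messages = json.load(f)
    return [m for m in messages if m.get("role") != "system"]


# Example with a synthetic log file in the assumed format:
with tempfile.TemporaryDirectory() as tmp:
    log = Path(tmp) / "session.json"
    log.write_text(json.dumps([
        {"role": "system", "content": "You are a helpful tutor."},
        {"role": "user", "content": "What is recursion?"},
        {"role": "assistant", "content": "A function calling itself."},
    ]), encoding="utf-8")
    turns = load_chat_log(log)
    print(len(turns))  # 2 non-system turns
```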