
Watch the video I made of my first short look HERE

Mistral has updated their LLM chat interface called Le Chat, which I understand is French for The Cat.

Anyway, it is a chat interface to their generative artificial intelligence model, much like OpenAI's ChatGPT, and Anthropic's Claude. You can read more about the new capabilities of Le Chat HERE.

All you need is a free account (for now). I do a quick first test of Le Chat to look at how it produces simple Python code. Le Chat does not allow for the upload of a spreadsheet file, so instead, I tried to use it to solve a simple system of two linear equations in two unknowns.

I do stress-test the chat interface a bit by using LaTeX in my prompts, and I also get it to use the symbolic Python package sympy, a package that I absolutely love but that is not that commonly used in the broader context of Python use cases.

I copy and paste the code into Visual Studio Code (after having set up a virtual Python environment in which I installed numpy, matplotlib, and sympy beforehand).
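For anyone wanting to follow along, the environment setup I describe above can be reproduced with a few terminal commands (macOS/Linux syntax assumed; the `.venv` directory name is my choice, not anything Le Chat generated):

```shell
# Create a virtual Python environment in the project folder
python3 -m venv .venv

# Activate it (on Windows use: .venv\Scripts\activate)
source .venv/bin/activate

# Install the packages used in the generated code
pip install numpy matplotlib sympy
```

With the environment active, Visual Studio Code can be pointed at `.venv` as the Python interpreter for the pasted code.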

Le Chat did a good job in this small first test. It used matplotlib to plot the two lines in the plane, showing the unique solution (at the intersection of the lines). It generated the augmented matrix as I instructed, but then solved the system of linear equations using sympy's solve function. After I instructed Le Chat to rather calculate the reduced row-echelon form of the matrix using sympy's rref method, it did exactly that.
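To give a flavor of what this looks like in sympy, here is a minimal sketch of the augmented-matrix approach. The two equations below are hypothetical stand-ins (the post does not list the actual system I gave Le Chat), but the rref and linsolve calls are the standard sympy way to do this:

```python
from sympy import Matrix, linsolve, symbols

x, y = symbols("x y")

# Augmented matrix for the (hypothetical) system:
#   x + 2*y = 4
#   3*x -   y = 5
A = Matrix([[1, 2, 4],
            [3, -1, 5]])

# Reduced row-echelon form: for a unique solution, the last
# column of the rref holds the values of the unknowns
rref_matrix, pivot_columns = A.rref()
print(rref_matrix)   # Matrix([[1, 0, 2], [0, 1, 1]])

# Cross-check by passing the same augmented matrix to linsolve
solution = linsolve(A, x, y)
print(solution)      # {(2, 1)}
```

The nice thing about the rref route is that it mirrors the hand calculation taught in a linear algebra course, which is exactly why I asked Le Chat to switch to it.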

Check out the new Le Chat for yourself HERE or watch the short video I made of my first test below.

An extension to the ChatGPT Desktop App for macOS

The GitHub Copilot extension in Visual Studio Code (VSC) provides a great pair-programming experience. This is especially true when using Jupyter notebooks and Python or R code blocks to generate short scripts (in code cells). This form of coding is great for research and teaching tasks.

The GitHub Copilot extension now also allows for the choice between OpenAI and Anthropic models, with Google models to follow soon. The GitHub Copilot extension opens in the right sidebar and allows us to chat with our favorite large language model (LLM). The image of the right sidebar in VSC below shows the GPT-4o model as the currently selected LLM.

The ChatGPT Desktop App for macOS can now also integrate with VSC (and other programming interfaces such as the Terminal and Xcode). This negates the need to go back and forth between the ChatGPT Desktop App and the coding environment.

The integration requires the installation of an extension which can be downloaded from the OpenAI instruction page HERE. Once the required file is downloaded, it must be installed as an extension in VSC. To do this, simply hold down the command and shift keys and hit the P key. This opens the Command Palette in VSC. Start typing Extensions: Install from VSIX… and select that command. Navigate to the downloaded file and install it.

When opening the ChatGPT Desktop App on a Mac, a new selection button will appear at the bottom of the page.

Connect to IDE

Clicking on this button allows for the selection of code editors or terminal apps to integrate into your chat.

Add IDE

Now the ChatGPT Desktop App will be aware of the code being written in VSC, making for a better experience.