
Generative AI (GenAI) is a powerful tool for programming. You can experiment with it in various ways on the Jupyter server, as explained in this notebook.

Integration with Jupyter

GenAI is integrated into the Jupyter interface using jupyter-ai. You can interact with it directly inside the notebook or through a chat panel.

Jupyter Notebook

To chat with a large language model (LLM) directly inside a notebook, first load the extension by running the cell below:

if not input('Load JupyterAI? [Y/n]').lower()=='n':
    %reload_ext jupyter_ai

After loading the jupyter_ai extension, run the following cell magic to ask an LLM what an LLM is.

%%ai
Explain what LLM is in one line.

Let’s try different output formats:

%%ai -f math
What is the Pythagoras theorem?
%%ai -f html
What is the Pythagoras theorem?
%%ai -f html
Illustrate the Pythagoras theorem using a diagram in SVG.
%%ai
Can I trust you?
%%ai
Can you give me answers to programming assignments directly?

Regardless of the response, LLMs are surprisingly good at programming at a basic level, thanks to the corpus of good programs and software documentation they are trained on. Let’s use the following exercise as an example:

def gcd(a, b):
    # YOUR CODE HERE
    raise NotImplementedError
# tests
assert gcd(48, 18) == 6
assert gcd(-20, 30) == 10
# hidden tests
# DON'T remove this cell
%%ai
Answer Exercise 1.

Garbage in, garbage out: give clear, context‑rich prompts to sharpen the model’s focus and elicit more relevant responses.

%%ai
Write a function `gcd` to calculate the GCD of two non-zero integers.

Mission impossible accomplished? Two key questions to reflect on when using GenAI this way:

  • Is the program efficient?
  • If it is, have I gained enough understanding to replicate the solution for similar problems?
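One way to address both questions is to test the AI-suggested code yourself before trusting it. The sketch below assumes the model returned the recursive one-liner discussed in this notebook and compares it against Python’s built-in `math.gcd` on a grid of small inputs:

```python
# Checking an AI-suggested gcd against the standard library.
# `ai_gcd` is an assumed example of what the model might return.
from math import gcd as reference_gcd

def ai_gcd(a, b):
    return ai_gcd(b % a, a) if a else abs(b)

# Compare on small integers, including negatives and zero.
for a in range(-12, 13):
    for b in range(-12, 13):
        assert ai_gcd(a, b) == reference_gcd(a, b), (a, b)
print("ai_gcd matches math.gcd on all test cases")
```

A passing check does not prove the code efficient or prove you understand it, but writing the test is itself a step toward both.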

As an example, the following prints a one-liner solution for the GCD.

def gcd(a, b): return gcd(b%a, a) if a else abs(b)  # one-liner for GCD
gcd_code = ''.join(In[-1].split('\n')[:-2])   # storing the code to query AI
print(gcd_code)

What’s worth learning are the ingredients involved in writing such a program:

%%ai
Explain the ingredients involved in writing the following program:
--
{gcd_code}

More challenging is the computational thinking involved, which you will develop over time, mostly outside the classroom. Dive into it with the help of AI:

%%ai
Explain how Euclidean algorithm works and whether there are better methods.
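As a concrete reference while reading the model’s answer, here is an iterative sketch of the Euclidean algorithm, which underlies the one-liner above:

```python
def euclid_gcd(a, b):
    """Iterative Euclidean algorithm: repeatedly replace (a, b) by (b, a % b)."""
    a, b = abs(a), abs(b)
    while b:
        a, b = b, a % b  # the remainder strictly decreases, so the loop terminates
    return a

print(euclid_gcd(48, 18))  # 6: 48 = 2*18 + 12, 18 = 1*12 + 6, 12 = 2*6 + 0
```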

Chat Panel

There is also a chat panel, Jupyternaut, for chatting with an LLM:

  1. Click the chat icon on the left menu bar. A chat panel will open.
  2. Enter some prompts to chat with Jupyternaut:
    • What is computational thinking?
    • /clear
    • Explain file:cs1302i25a/source/Lab0/notebook_link_generator.py

You can add a cell’s text as context for Jupyternaut. Try this on the following code:[1]

  1. Select the cell you want to use.
  2. Switch to Edit mode if it is a markdown cell. (Double click or press Enter.)
  3. Highlight the desired text inside the cell.
  4. In the Jupyternaut panel, enter a prompt, click the down arrow button and choose Send message with selection.
from math import gcd   # yet another one-liner for gcd!
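Besides asking Jupyternaut about it, you can run the import and try the function directly. Unlike a naive recursive version, `math.gcd` handles negative inputs and, from Python 3.9 onward, accepts more than two arguments:

```python
from math import gcd

print(gcd(48, 18))      # 6
print(gcd(-20, 30))     # 10
print(gcd(12, 18, 30))  # 6 (multiple arguments require Python 3.9+)
```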

Another way to add context is through an embedding model:

  1. /learn cs1302_25a/Lab0/ – tell the LLM to ingest the documents in Lab0.
  2. /ask Summarize what I need to do for Lab0 – request a concise summary of the lab tasks.
  3. /learn -d – delete everything that was previously learned.
%%ai
Explain how a chat model and an embedding model work together to give
domain-specific responses.
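The `/learn` and `/ask` workflow above is an instance of retrieval-augmented generation. The toy sketch below illustrates the idea only; it is not jupyter-ai’s actual implementation, and the bag-of-words “embedding” is a deliberately crude stand-in for a real embedding model:

```python
# Toy sketch of retrieval-augmented generation (RAG); not jupyter-ai's code.
import math

docs = [
    "Lab0 covers Jupyter notebooks and running cells.",
    "The Euclidean algorithm computes the GCD of two integers.",
    "VSCode can be launched from the JupyterHub launcher.",
]

def tokenize(text):
    return [w.strip(".,?!").lower() for w in text.split()]

def embed(text):
    """Hypothetical embedding: a bag-of-words count vector over a tiny vocabulary."""
    vocab = sorted({w for d in docs for w in tokenize(d)})
    words = tokenize(text)
    return [words.count(w) for w in vocab]

def cosine(u, v):
    dot = sum(x * y for x, y in zip(u, v))
    norm = math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(y * y for y in v))
    return dot / norm if norm else 0.0

def retrieve(query):
    """/learn indexes documents; /ask retrieves the most similar one as context."""
    q = embed(query)
    return max(docs, key=lambda d: cosine(embed(d), q))

context = retrieve("How do I compute the GCD?")
prompt = f"Answer using this context:\n{context}\n\nQuestion: How do I compute the GCD?"
print(prompt)  # the chat model then answers from the retrieved context
```

The embedding model turns text into vectors so that similar passages can be found by similarity search; the chat model then answers the question with the retrieved passages prepended as context.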

You may check the default model configuration as follows:

  1. Click the gear icon (⚙️) at the top‑right of the chat panel.
  2. In the Completion model dropdown, select DIVE :: chat.
  3. In the Embedding model dropdown, select DIVE :: embed.
  4. Click Save Changes.
  5. Click the back arrow to return to the chat panel.
  6. Send a test message to confirm the chat model works.

If you are using other AI services (e.g., Azure OpenAI), you may reconfigure JupyterAI to use them.

If you want to reset the configuration in case of error, remove the configuration folder using the shell command below:

if input('Reset Jupyter AI configuration? [y/N]').lower() == 'y':
    !rm -rf ~/.local/share/jupyter/jupyter_ai

After deleting the folder, simply restart the Jupyter server to recreate it.

Integration with VSCode

GenAI has been integrated into the Visual Studio Code (VSCode) interface[2] via the extensions below.

  • Continue: Configured to run our deployed models.
  • Copilot: Connects to a subscription-based service by Microsoft.

Continue

To open the current folder Lab0 in VSCode, execute the following cell to generate the link:

from vscode_link_generator import vscode

vscode('.')
  • Start a chat with
    Explain @vscode_link_generator.py
  • Switch to agent, open the file vscode_link_generator.py, and ask:
    Which file do I have open?

The configuration files for the Continue extension are stored in .continue under your home directory:

!ls ~/.continue

Similar to JupyterAI, you may reconfigure Continue to use other providers.

%%ai
How is an AI agent different from an AI chatbot?
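A rough illustration of the distinction: a chatbot makes one model call per prompt, while an agent runs a loop in which the model may call tools and observe their results. Everything below is a made-up stub, not any real agent framework:

```python
# Toy contrast between a chatbot and an agent; the "model" here is a stub.
def fake_model(prompt):
    """Stand-in for an LLM: asks for a tool when it sees unanswered arithmetic."""
    if "2 + 3" in prompt and "TOOL RESULT" not in prompt:
        return "CALL calculator: 2 + 3"
    return "The answer is 5."

def chatbot(prompt):
    # A chatbot is a single call: prompt in, reply out.
    return fake_model(prompt)

def agent(prompt, tools):
    # An agent loops: the model may request a tool, observe the result, and continue.
    while True:
        reply = fake_model(prompt)
        if reply.startswith("CALL "):
            name, arg = reply[5:].split(": ", 1)
            result = tools[name](arg)
            prompt += f"\nTOOL RESULT: {result}"  # feed the observation back
        else:
            return reply

tools = {"calculator": lambda expr: eval(expr)}  # toy tool; never eval untrusted input
print(agent("What is 2 + 3?", tools))  # The answer is 5.
```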

If you want to reset the configuration in case of error, remove the configuration folder and then restart the Jupyter server to recreate it:

if input('Reset Continue configuration? [y/N]').lower() == 'y':
    !rm -rf ~/.continue

GitHub Copilot

A popular AI-assisted programming tool is GitHub Copilot for Visual Studio Code (VSCode).

To get started, open the setup guide:

  1. Follow the last bullet point of Step 1 to get free access as a verified student.
  2. Instead of Step 2, you can access VSCode through the JupyterHub:
    1. Click File → New launcher → VSCode to open VSCode.
    2. Click the profile icon on the left menu and select Sign in with GitHub to use GitHub Copilot.
    3. After logging into GitHub, click the Copilot icon at the top of VSCode to start chatting.
  3. Step 3 is for advanced users who would like to use GitHub Copilot in the command-line interface (CLI), which is available in both JupyterLab and VSCode. You may jump directly to the step following the prerequisites.

Glossary

Footnotes
  1. The implementation is actually written in C with many lines of code, applying Lehmer’s algorithm. See also the related GitHub issue.

  2. The interface is served by the code-server app.