Automatically populate Google NotebookLM with learning resources from any roadmap.sh roadmap.
The pipeline extracts every topic from a roadmap, finds the community-curated YouTube videos and articles linked to each one, creates a dedicated NotebookLM notebook per topic, and uploads all the sources into it. Once your notebooks are loaded, a small CLI lets you fuzzy-search through them and fire off study material generation (podcasts, flashcards, quizzes, etc.) in bulk — without copy-pasting prompts everywhere.
Notebook creation and source ingestion are automated using notebooklm-py, a community-built Python package that wraps the NotebookLM API.
Defaults to the Backend roadmap. To use it with any other roadmap.sh roadmap, see the Configuration section.
roadmap.sh GitHub repo
│
▼
get_resources.py ← scrapes topics + resource links → <roadmap>_resources.json
│
▼
create_notebooks.py ← creates one NotebookLM notebook per topic, uploads all sources
│
▼
generate_study_material_for_notebook.py ← interactive CLI to generate study materials per notebook
1. Clone the repo
git clone https://github.com/WaseemAldemeri/roadmap-to-notebooklm.git
cd roadmap-to-notebooklm
2. Create a virtual environment and install dependencies
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
3. Log in to NotebookLM
A one-time Google login that saves a local session, reused by all scripts:
notebooklm login
A browser window will open — sign in with your Google account. You won't need to do this again unless the session expires.
python get_resources.py
Fetches the roadmap JSON from GitHub, finds every topic node, downloads each topic's markdown file, and extracts all linked YouTube videos and articles. Outputs a <roadmap>_resources.json file (e.g. backend_resources.json for the default backend roadmap).
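The resources file maps each topic to its extracted links. A minimal sketch of what consuming it might look like — the field names (`videos`, `articles`) are illustrative assumptions, not the script's guaranteed schema:

```python
import json

# Hypothetical shape of <roadmap>_resources.json; keys here are assumptions.
example = {
    "Caching": {
        "videos": ["https://www.youtube.com/watch?v=example"],
        "articles": ["https://example.com/intro-to-caching"],
    }
}

# Each top-level key is a topic name; each topic maps to its resource links.
for topic, resources in example.items():
    total = len(resources["videos"]) + len(resources["articles"])
    print(f"{topic}: {total} sources")
```

In practice you would `json.load()` the generated file instead of the inline `example` dict.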
python create_notebooks.py
Reads the resources file, creates one NotebookLM notebook per topic, and uploads a context markdown file plus all resource URLs into each notebook. There is a 5-second pause between notebooks to avoid hitting rate limits. This step takes a while — let it run.
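The per-topic loop can be sketched as below. The `client` object stands in for whatever notebooklm-py exposes; the method names `create_notebook` and `add_source` are assumptions for illustration, not the package's actual API:

```python
import time

def create_notebooks(client, resources, delay=5):
    """Create one notebook per topic and upload its sources.

    `client` is a stand-in for the notebooklm-py client; its method
    names here are hypothetical.
    """
    created = []
    for topic, urls in resources.items():
        notebook = client.create_notebook(topic)   # hypothetical call
        for url in urls:
            client.add_source(notebook, url)       # hypothetical call
        created.append(notebook)
        time.sleep(delay)  # pause between notebooks to avoid rate limits
    return created
```

The fixed `delay` between iterations is the same rate-limiting tactic the script uses.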
NotebookLM limits how much you can generate at once, so this is intentionally done on demand rather than in bulk upfront.
python generate_study_material_for_notebook.py
An interactive CLI will appear:
- Fuzzy-search through all your notebooks — start typing a topic name to filter.
- Select which materials to generate — podcast, slide deck, flashcards, quiz, and/or video overview. All are selected by default; toggle with Space.
- Hit Enter and the generation jobs are dispatched to Google's servers. They render in the background inside NotebookLM.
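The fuzzy search above can be approximated with a case-insensitive subsequence match — a simple stand-in for whatever matching the CLI actually uses:

```python
def fuzzy_filter(query, names):
    """Keep names whose characters contain `query` as a subsequence,
    ignoring case. A sketch, not the CLI's real matcher."""
    def matches(name):
        it = iter(name.lower())
        # `ch in it` advances the iterator, so order must be preserved.
        return all(ch in it for ch in query.lower())
    return [n for n in names if matches(n)]
```

For example, typing "dbi" would narrow a list to "Database Indexing".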
Everything roadmap-specific lives in config.py. To use a different roadmap, update the values at the top of that file:
ROADMAP_NAME = "frontend"
ROADMAP_DISPLAY_NAME = "Frontend"
ROADMAP_URL = "https://roadmap.sh/frontend"
ROADMAP_JSON_URL = "https://v-raw-githubusercontent-com.adclosenn.dev/kamranahmedse/developer-roadmap/master/src/data/roadmaps/frontend/frontend.json"
GITHUB_API_CONTENT_URL = "https://v-api-github-com.adclosenn.dev/repos/kamranahmedse/developer-roadmap/contents/src/data/roadmaps/frontend/content"
The output filenames and folder names (<roadmap>_resources.json, <DisplayName>_Context_Files/) are derived automatically — no other files need changing.
The URL pattern is consistent across all roadmap.sh roadmaps: replace backend with the roadmap slug (the part after roadmap.sh/ in the URL) in both GitHub URLs.
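Because the pattern is uniform, the config values can be derived from the slug alone. A sketch of that derivation, using the same URL patterns shown above (the function name and dict return are illustrative, not part of config.py):

```python
def config_for(slug, display_name):
    """Derive config values for any roadmap.sh roadmap from its slug.

    Hypothetical helper; config.py itself uses plain constants.
    """
    repo = "kamranahmedse/developer-roadmap"
    return {
        "ROADMAP_NAME": slug,
        "ROADMAP_DISPLAY_NAME": display_name,
        "ROADMAP_URL": f"https://roadmap.sh/{slug}",
        "ROADMAP_JSON_URL": (
            f"https://v-raw-githubusercontent-com.adclosenn.dev/{repo}/master"
            f"/src/data/roadmaps/{slug}/{slug}.json"
        ),
        "GITHUB_API_CONTENT_URL": (
            f"https://v-api-github-com.adclosenn.dev/repos/{repo}"
            f"/contents/src/data/roadmaps/{slug}/content"
        ),
    }
```

For example, `config_for("devops", "DevOps")` yields the same five values you would otherwise edit by hand.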
The STUDY_MATERIALS dict at the bottom of config.py contains the prompts sent to NotebookLM for each material type. Adjust the instructions strings there to change the tone, audience level, or focus of the generated content.
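As a rough illustration of the kind of per-material prompt customization described above — the keys and wording here are assumptions, not the actual contents of config.py:

```python
# Illustrative shape only; check config.py for the real keys and prompts.
STUDY_MATERIALS = {
    "podcast": {
        "instructions": "Explain the topic conversationally for a junior backend developer.",
    },
    "quiz": {
        "instructions": "Write questions that test practical understanding, not trivia.",
    },
}
```

Editing the `instructions` strings is enough to shift tone or audience level; no script changes are needed.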
config.py — all configurable values (roadmap URLs, prompts)
get_resources.py — Stage 1: extract topics and resource links
create_notebooks.py — Stage 2: create NotebookLM notebooks and upload sources
generate_study_material_for_notebook.py — Stage 3: interactive study material generator
<roadmap>_resources.json — generated by Stage 1, consumed by Stage 2
<DisplayName>_Context_Files/ — markdown context files uploaded alongside each notebook