Course Timeline
MODULE 01
JavaScript & Programming Foundations
Weeks 1–4
WK 01
Hello, World — What is Programming?
The big picture: computers, code, and creativity
Topics Covered
- How computers execute instructions step-by-step
- What JavaScript is and where it runs (browser & Node)
- Variables, values, and the console
- Data types: strings, numbers, booleans
- Writing your first program
Lab: console.log("Hello, World!")
Learning Outcomes
- Set up a development environment
- Declare and use variables in JS
- Understand what a runtime error is
Foundation
WK 02
Making Decisions & Repeating Actions
Conditionals, loops, and program flow
Topics Covered
- if / else if / else statements
- Comparison and logical operators
- for loops and while loops
- Loop control: break, continue
- Flowcharts as a visual planning tool
Activity: FizzBuzz in JS
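One possible solution sketch for the FizzBuzz activity, combining conditionals, the modulo operator, and a for loop:

```javascript
// FizzBuzz: print 1..n, but "Fizz" for multiples of 3,
// "Buzz" for multiples of 5, and "FizzBuzz" for both.
function fizzbuzz(n) {
  if (n % 15 === 0) return "FizzBuzz"; // divisible by both 3 and 5
  if (n % 3 === 0) return "Fizz";
  if (n % 5 === 0) return "Buzz";
  return String(n);
}

for (let i = 1; i <= 15; i++) {
  console.log(fizzbuzz(i));
}
```

Note the order of checks: the "divisible by 15" case must come first, or it would be caught by the "divisible by 3" branch.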
CT Connection
- Conditionals = decision nodes in an algorithm
- Loops = recognising repeated patterns
- Draw flowchart before writing code
Algorithm Design
WK 03
Functions — Building Reusable Blocks
Abstraction through functions and scope
Topics Covered
- Function declarations vs. arrow functions
- Parameters, arguments, and return values
- Variable scope (local vs. global)
- Pure functions and side effects
- Calling functions inside other functions
Lab: Build a unit converter
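A minimal sketch of the unit-converter lab, showing both function styles from this week; the function names are illustrative, not prescribed:

```javascript
// A function declaration and an arrow function, both pure:
// output depends only on the inputs, with no side effects.
function milesToKm(miles) {
  return miles * 1.60934;
}

const celsiusToFahrenheit = (c) => c * 9 / 5 + 32;

console.log(milesToKm(5));             // kilometres in 5 miles
console.log(celsiusToFahrenheit(100)); // → 212, boiling point of water
```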
CT Connection
- Functions embody abstraction
- Naming a function = hiding complexity
- Reusable code = pattern recognition in action
Abstraction
WK 04
Arrays, Objects & Working with Data
Storing and manipulating collections
Topics Covered
- Arrays: creation, indexing, iteration
- Array methods: map, filter, reduce
- Objects: keys, values, nesting
- JSON as a data format
- Destructuring and spread operator
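The three core array methods can be sketched on a small made-up dataset:

```javascript
// Sample data (invented for illustration).
const students = [
  { name: "Ada",   score: 92 },
  { name: "Alan",  score: 58 },
  { name: "Grace", score: 75 },
];

const names  = students.map((s) => s.name);                  // transform each item
const passed = students.filter((s) => s.score >= 60);        // keep items that match
const total  = students.reduce((sum, s) => sum + s.score, 0); // fold to one value

console.log(names);         // ["Ada", "Alan", "Grace"]
console.log(passed.length); // → 2
console.log(total);         // → 225
```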
Project 1 due: Data analyser
Learning Outcomes
- Model real-world entities as JS objects
- Transform data using functional array methods
- Read and write JSON
🏁 Project 1
MODULE 02
Computational Thinking
Weeks 5–7
WK 05
Decomposition & Pattern Recognition
Breaking problems apart and spotting structure
Topics Covered
- The four pillars of computational thinking
- Decomposing complex problems into sub-tasks
- Identifying repeating patterns in problems and data
- Modelling problems as input → process → output
- Top-down vs. bottom-up design
Activity: Decompose making a sandwich
Learning Outcomes
- Apply decomposition to split a program into functions
- Recognise where loops replace repeated code
- Map a real problem to pseudocode steps
Decomposition
WK 06
Abstraction & Algorithm Design
Hiding complexity; designing step-by-step solutions
Topics Covered
- What abstraction means: hiding vs. ignoring detail
- Pseudocode and structured English
- Sorting algorithms: bubble sort, insertion sort
- Searching algorithms: linear and binary search
- Big-O intuition (fast vs. slow algorithms)
Activity: Sort a deck of cards by hand, then in JS
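The divide-and-conquer idea behind binary search can be traced in a few lines; each loop iteration halves the remaining range:

```javascript
// Binary search on a sorted array: repeatedly halve the search range.
function binarySearch(sorted, target) {
  let lo = 0;
  let hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) lo = mid + 1; // target is in the right half
    else hi = mid - 1;                      // target is in the left half
  }
  return -1; // not found
}

console.log(binarySearch([1, 3, 5, 7, 9, 11], 7)); // → 3
```

Tracing this by hand (the mid values it visits) is exactly the "trace through code like a computer would" exercise above.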
CT Connection
- A sorting algorithm is the clearest example of precise algorithm design
- Binary search = divide and conquer
- Trace through code like a computer would
Algorithm Design
WK 07
CT in Practice — Rock, Paper, Scissors
End-to-end: from problem to working JS game
Topics Covered
- Apply all four CT pillars to a single project
- Mapping game logic to objects and functions
- Randomness: Math.random() and uniform distributions
- Event-driven programming: buttons, clicks
- DOM manipulation basics
Project 2: Build Rock–Paper–Scissors
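The game logic for Project 2 might be sketched like this (names are illustrative); the DOM/event-listener layer would sit on top:

```javascript
const MOVES = ["rock", "paper", "scissors"];

function randomMove() {
  // Math.random() is uniform on [0, 1), so each move is equally likely.
  return MOVES[Math.floor(Math.random() * MOVES.length)];
}

function winner(a, b) {
  if (a === b) return "draw";
  const beats = { rock: "scissors", paper: "rock", scissors: "paper" };
  return beats[a] === b ? "player A" : "player B";
}

console.log(winner("rock", "scissors")); // → player A
console.log(winner(randomMove(), randomMove()));
```

Mapping the rules into the `beats` object is the "game logic to objects" step; `winner` is the pure, testable core.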
Learning Outcomes
- Translate a real-world problem into code using CT
- Handle user interaction with event listeners
- Use Math.random() to simulate chance
🏁 Project 2
MODULE 03
Probability & Data Thinking
Weeks 8–10
WK 08
Randomness, Chance & Simulations
From coin flips to Monte Carlo methods
Topics Covered
- What is probability? Events, outcomes, sample space
- Uniform vs. weighted random distributions
- Law of large numbers: simulating many coin flips
- Monte Carlo estimation (e.g., estimating π)
- Seeding randomness; reproducibility
Lab: Simulate 10,000 dice rolls in JS
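A sketch of the dice lab: a loop runs the trials and an array tallies the outcomes.

```javascript
// Simulate 10,000 rolls of a fair six-sided die and tally each face.
const counts = [0, 0, 0, 0, 0, 0];
const TRIALS = 10000;

for (let i = 0; i < TRIALS; i++) {
  const roll = Math.floor(Math.random() * 6); // uniform integer 0..5
  counts[roll]++;
}

// By the law of large numbers, each face should land near TRIALS / 6 ≈ 1667.
counts.forEach((c, face) => console.log(`face ${face + 1}: ${c}`));
```

The `counts` array is already the raw material for the histogram outcome above.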
Learning Outcomes
- Use loops to run probabilistic simulations
- Understand why more trials → better estimates
- Build a histogram from simulation data
Simulation
WK 09
Descriptive Statistics & Distributions
Mean, median, variance, and the normal curve
Topics Covered
- Mean, median, mode — when to use each
- Variance and standard deviation (intuition-first)
- Normal distribution: the bell curve
- Skew, outliers, and when averages lie
- Visualising distributions in the browser with Canvas
Lab: Visualise a dataset with a live histogram
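The statistics half of the lab can be sketched in plain JS (the Canvas drawing would consume these numbers):

```javascript
// Mean, median, and standard deviation from a plain array.
function mean(xs) {
  return xs.reduce((a, b) => a + b, 0) / xs.length;
}

function median(xs) {
  const s = [...xs].sort((a, b) => a - b);     // sort a copy, not the original
  const mid = Math.floor(s.length / 2);
  return s.length % 2 ? s[mid] : (s[mid - 1] + s[mid]) / 2;
}

function stddev(xs) {
  const m = mean(xs);
  const variance = mean(xs.map((x) => (x - m) ** 2)); // population variance
  return Math.sqrt(variance);
}

const data = [2, 4, 4, 4, 5, 5, 7, 9];
console.log(mean(data));   // → 5
console.log(median(data)); // → 4.5
console.log(stddev(data)); // → 2
```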
Learning Outcomes
- Compute statistics from an array using JS
- Draw a bar chart on an HTML canvas
- Identify when a dataset is normally distributed
Statistics
WK 10
Bayes' Theorem & Conditional Probability
Updating beliefs with new evidence
Topics Covered
- Conditional probability: P(A|B)
- Bayes' theorem explained visually
- Prior, likelihood, and posterior
- Naive Bayes as a gateway to ML classifiers
- Spam filter walkthrough (word probabilities)
Activity: Build a tiny text classifier in JS
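The spam-filter walkthrough reduces to one application of Bayes' theorem; the probabilities below are made-up illustrative numbers, not real spam statistics:

```javascript
// Bayes' theorem: posterior = likelihood * prior / evidence.
function bayes(prior, likelihood, evidence) {
  return (likelihood * prior) / evidence;
}

const pSpam = 0.2;           // prior: 20% of mail is spam
const pWordGivenSpam = 0.6;  // the word appears in 60% of spam
const pWordGivenHam = 0.05;  // ...and in 5% of legitimate mail

// Evidence: total probability of seeing the word at all.
const pWord = pWordGivenSpam * pSpam + pWordGivenHam * (1 - pSpam);

// P(spam | word): seeing the word raises 20% prior to 75% posterior.
console.log(bayes(pSpam, pWordGivenSpam, pWord)); // ≈ 0.75
```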
Bridge to ML
- This week bridges probability directly into ML
- Classifiers assign probabilities to class labels
- Softmax (used in neural nets) is a probability function
🌉 Bridge Week
MODULE 04
Introduction to Machine Learning
Weeks 11–14
WK 11
What is Machine Learning?
Learning from data instead of writing rules
Topics Covered
- Supervised vs. unsupervised learning
- Training data, labels, and features
- The training → evaluation loop
- Overfitting vs. underfitting (intuition)
- Real-world ML applications: images, text, games
Demo: Teachable Machine in the browser
CT Revisited
- ML is decomposition applied to pattern finding
- Features = abstraction (we choose what matters)
- Training loop = algorithm iterating toward a solution
ML Concepts
WK 12
Neural Networks & How They Learn
Neurons, weights, and gradient descent
Topics Covered
- Biological vs. artificial neurons
- Weighted sum + bias + activation function
- ReLU, sigmoid, softmax — when and why
- Forward pass: how a prediction is made
- Loss function: measuring how wrong we are
- Backpropagation: nudging weights in the right direction
Lab: Implement a single neuron in JS from scratch
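A sketch of the single-neuron lab: the weighted sum, bias, and a sigmoid activation; the demo weights are arbitrary.

```javascript
// One artificial neuron: weighted sum + bias, then an activation.
function sigmoid(z) {
  return 1 / (1 + Math.exp(-z));
}

function neuron(inputs, weights, bias) {
  // Pre-activation z: dot product of inputs and weights, plus bias.
  const z = inputs.reduce((sum, x, i) => sum + x * weights[i], bias);
  return sigmoid(z); // squash into (0, 1)
}

// z = 1*2 + 0*(-1) + (-1) = 1, so the output is sigmoid(1).
console.log(neuron([1, 0], [2, -1], -1)); // ≈ 0.731
```

This forward pass is the "prediction" half; backpropagation would adjust `weights` and `bias` to reduce the loss.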
Probability Link
- Softmax converts raw scores to a probability distribution
- Cross-entropy loss = comparing two probability distributions
- Confidence % = the model's probability estimate
Neural Nets
WK 13
MNIST Digit Recogniser with TensorFlow.js
Train a real model in the browser
Topics Covered
- The MNIST dataset: 70,000 handwritten digits
- TensorFlow.js API: tensors, layers, compile, fit
- Image pre-processing: normalising pixel values to [0,1]
- Training epochs, batch size, and accuracy tracking
- Prediction: argmax and confidence scores
Lab: Draw digits and test your trained model
End-to-end Pipeline
- Pixels → float array → tensor → model.predict()
- 28×28 canvas → 784-number vector
- 10 output scores → softmax → probability per digit
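The last step of the pipeline can be sketched in plain JS (the scores array is invented demo data standing in for the model's 10 outputs):

```javascript
// Softmax turns raw scores into a probability distribution; argmax
// picks the index of the most likely digit.
function softmax(scores) {
  const max = Math.max(...scores);              // subtract max for numerical stability
  const exps = scores.map((s) => Math.exp(s - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

function argmax(xs) {
  return xs.reduce((best, x, i) => (x > xs[best] ? i : best), 0);
}

const scores = [1.2, 0.3, 5.1, 0.8, 0.1, 0.4, 0.2, 2.0, 0.9, 0.5];
const probs = softmax(scores);
console.log(argmax(probs));                 // → 2 (predicted digit)
console.log(probs.reduce((a, b) => a + b)); // ≈ 1 (probabilities sum to one)
```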
Project 3 Prep
WK 14
Final Project & Reflection
Build something that uses code, data, and learning
Final Project Options
- Option A: Extend the digit recogniser with confidence visualisation
- Option B: Build a Naive Bayes text classifier for movie reviews
- Option C: Create an interactive probability simulation dashboard
- Option D: Train a custom image classifier using Teachable Machine + TF.js
Presentations in final class
Reflection Topics
- How did CT help you plan your project?
- Where did probability appear in your solution?
- What would you teach the model next?
- Ethics: bias in datasets and model fairness
🏆 Final Project