Labs and Projects
The lab content of this course begins with exercises designed to give
students familiarity with programming embedded systems at
various levels of abstraction, with varying amounts of infrastructure
support. It then progresses to team projects. Teams of three to four
students will be expected to develop a plan of action, including a timeline
and a defined division of responsibilities, and then execute a project that
culminates in a demo session and a poster presentation. Presentations will
be held on Monday, May 12, and the project reports are due
Friday, May 16.
Lab Times: Mondays 4-7 PM, Thursdays 5-8 PM in 125 Cory.
Each lab will be graded according to the following rubric:
Pre-Lab - 10% - Should be done before lab, handed in when you get to lab.
Lab Checkout - 45% - You have one week to complete the lab and show the TA.
Lab Writeup - 45% - Writeups are due exactly one week after lab.
Late pre-labs may be handed in during lab for half credit; no pre-labs will be accepted after the lab session.
Late checkouts and lab writeups will be deducted 10% for each day late. A late checkout does not extend your writeup deadline; the writeup is due one week after the scheduled lab date for each lab. You must notify me at least 4 days in advance of any emergency or situation requiring an extension of the deadlines.
Lab Writeup Guideline
The initial lab exercises will include:
Week 2 (Jan. 28): Orientation: Getting familiar with the lab, including
computing resources, LabVIEW, and techniques for interfacing to sensors,
calibrating them, and processing and analyzing their data. A major objective
of the lab exercise is to become familiar with signal processing using the
dataflow model of computation and to understand how to interface a desktop
computer to a physical sensor.
Week 3 (Feb. 4): Sensor modeling: This exercise will evaluate the use of
accelerometers and rate gyros for measuring tilt and tracking changes
in location. A major objective of this lab is to understand how to model
the behavior of a sensor, how to deal with noise, and how to evaluate
limitations in precision and accuracy.
Week 4 (Feb. 11): Microcontroller programming: This exercise will
use AVR tools to adapt demo programs in C and construct new programs in C
for the 8-bit microcontroller in the
iRobot Create platform. A major objective
of this lab is to familiarize students with C programming for "bare iron" (no
operating system), basic I/O techniques for interacting with sensors and
actuators, and basic design flow techniques such as cross
compiling and debugging.
Week 5 (Feb. 18): Microcontroller interfacing: Students will extend
the iRobot Create with accelerometers and/or rate gyros to measure tilt and
will calibrate the sensors to determine how effectively they can measure
tilt. The major objective is to understand the basics of hardware interfaces
to microcontrollers, polling and interrupt-driven I/O, and development of
C interfaces to custom hardware.
Week 6 (Feb. 25): Hill climbing: Students will program the iRobot
Create to climb a ramp, find the maximal point, and stop. The iRobot Create
must not fall off the edges of the ramp. The major objective
of this lab is to jointly design a modal model for the hill climbing problem
that includes both its physical dynamics and exception conditions of encountering
a cliff, and to understand the challenges of navigating without precise location information.
Week 7 (Mar. 3): Project management: Students will finalize project
teams and project definitions, construct a plan for the project with specific
milestones, and assign responsibilities to project participants.
Weeks 8-16: Projects
Peer Evaluation Form
- Wed 4/2 - one-page milestone
- Wed 4/9 - mini project updates (5 min each), new milestone prediction
- Sat 4/12 - Cal Day
- Mon 4/28 - mini project updates (5 min each), new milestone prediction (what to present for the final presentation)
- Mon 5/12 - 8:30 - 10:30 am, Project presentations and demo in the lab (125 Cory)
- Fri 5/16 - Project final report due
Guidelines for Project Presentations:
- The slot for each team is 20 minutes, including time for questions.
- Time your presentation for 15 minutes; use 5-8 slides depending on how long your demo takes.
It's very important to finish on time. Practice your presentation a few times.
- Topics to cover:
- 1-slide overview of the project and your approach
- Hardware/software platform and tools used
- Main technical challenges -- in design or implementation -- and how you tackled them.
Focus on the top 2-3, at most one slide on each. Include any cool algorithms or
design insights that you came up with.
- Demo -- could be interleaved with the presentation
- Finish with a 1-slide summary: what did you achieve, what more can be done?
- Each slide must convey 1-3 key ideas; don't clutter the slide; use at least 20 pt font
Guidelines for Project Reports:
- One report per team
- Recommended length: 5-10 pages, use at least 11 pt font
- Topics to cover:
- Introduction & Problem Definition
- Outline of your Approach and how it compares to existing projects
- Algorithms and Formal Models used
- Major Technical Challenges in the Design/Implementation and how you tackled them
- Summary of Results -- what worked, what didn't, why, ideas for future work
- A one-paragraph narrative on the roles each team member played in the project
- How the project applied 2 or more of the key concepts from the course.
- Feedback: List of topics covered in class that came in handy; any topics that weren't covered that would have been useful
- Illustrate algorithms and models with diagrams, include pictures of your hardware, screenshots, etc.
- Keep the report short and precise; avoid lengthy and verbose descriptions!
Projects will be performed in teams of up to four students and should
be selected from the following list. In exceptional circumstances, student
teams may propose projects that are not on this list. All projects
should include nontrivial applications of two or more of the following
key concepts in the course:
- modeling of physical dynamics,
- reliable real-time behavior,
- modal behavior governed by FSMs coupled with formal analysis,
- real-time networks,
- simulation strategies, and
- design methodologies for embedded systems design.
The project report (along with the poster presentation and demo) should make clear which
of the above key concepts are being addressed and how.
For example, a project that addresses the fourth and last bullets could focus
on code generation from high-level modal models, and demonstrate a design
environment based on LabVIEW or Ptolemy II for an embedded target.
Software platforms can be "bare iron," real-time operating systems,
embedded tools like LabVIEW Embedded or Simulink/Stateflow with Real-Time Workshop,
or standard operating systems such as Windows or Linux. Attention to design quality
and reliability and trustworthiness of the end product is essential.
For many of the projects, it is highly advisable to identify a mentor (typically
a graduate student) who can provide familiarity with the relevant hardware and software.
Use rate gyro and/or accelerometer sensors on a small handheld device to control
concurrent processes on the handheld device. For example, the project might design
a "writing wand" that exploits persistence of vision (POV) phenomena to create
displays in space, while simultaneously generating sounds in response to motion.
Possible mentor: Isaac Liu.
Wireless wand controller:
Use gyroscope and/or accelerometer sensors on a small handheld device to wirelessly
control real-time processes on a host machine. For example, the project might use a
Nintendo Wiimote and a bluetooth connection to a host running an RTOS or LabVIEW
Embedded to paint images on a screen, to control
a game, or to manipulate sounds in real time. A key possible elaboration would be
to combine more than one sensor input and to control actuators (such as a robot).
Use real-time Linux with RTSJ (the real-time specification for Java) to
create multithreaded and/or distributed audio applications on a multicore platform.
Possible mentor: Jia Zou.
Distributed real-time systems:
We have access to three prototype Agilent P1000 boxes which combine a Linux
PowerPC computer, ethernet interfaces with high-precision clock synchronization,
and digital I/O lines capable of high-precision time stamping of input events
and high precision actuation of output events. This project would
use the P1000 boxes to create a distributed rhythm machine
that is responsive to precision-timed inputs from humans. Part of the
challenge here will be to devise suitable sensor and actuator interfaces
to the limited timed I/O mechanisms of the P1000 boxes. For example,
actuators could consist of solenoids that tap a surface and sensors might
be simple switches.
Possible mentor: Slobodan Matic.
We have a prototype of a code generator that produces embedded C code
from high-level modal models. Various possible projects elaborate
on this code generator by, for example:
- Optimizing the generated code by pruning demonstrably unreachable states.
Possible mentor: Thomas Feng.
- Implementing a timed code generator for the Giotto model of computation.
Possible mentor: Jackie Leung.
- Implementing a code generator for the synchronous/reactive model of computation.
Possible mentor: Jackie Leung.
All of these alternatives must target a physical platform, such as the
iRobot Create, and show nontrivial applications.
Use the infrared communication mechanisms and bluetooth wireless extensions
of the iRobot Create to
design a multi-agent collision-avoidance strategy.
New interface to iRobot based on embedded 32-bit processor:
The current command module provides a good 8/16-bit controller for the iRobot. It may be desirable to develop a command module based on a 32-bit processor. The latest version of the ARM core, the Cortex-M3, represents a good tradeoff between 32-bit performance and 16-bit code density. The Luminary Stellaris® LM3S8962 (http://www.luminarymicro.com/products/LM3S8962.html) is an example of a processor with such a Cortex-M3 core. For this project we will create a full development environment for such a processor, including access to the existing iRobot base I/O, new and additional I/O (e.g., IEEE 1588 support built into the Ethernet controller), and integration into existing textual (e.g., C/C++) and graphical (e.g., Ptolemy, LabVIEW) programming environments. A non-trivial application would be used as an example of the new environment.
- LabVIEW on Atmega
- The LabVIEW Microprocessor SDK (http://www.ni.com/labview/microprocessor_sdk.htm) typically targets 32-bit processors by generating C code and specifying a targeting environment for a given processor. In this project we try to understand the challenges of targeting a smaller footprint by targeting LabVIEW to the ATmega processor in the command module. A set of optimized routines would be developed that allow the generated code to implement concurrent timed loops on the microcontroller. I/O would also be implemented on the command module or via the serial interface on the iRobot itself. A non-trivial application would be used as an example of the new environment.
- Coordination of robots in different locations via GPS