FIRST® Tech Challenge Judge Manual | 29
Gracious Professionalism® - “Doing your best work while treating others with respect and kindness - It’s what makes FIRST, first.”
For the Machine, Creativity, and Innovation Awards (MCI Awards), a team can simply and eloquently describe the
basis of a robot mechanism. The team can also provide documentation in the pit interview, but a clear verbal
description of the work the team has done, or of the steps it took to develop its robot, mechanism, or
strategy, can carry equal weight with the documented information.
It is valuable to remember that the information teams provide, in any form, helps judges inform their
decisions rather than serving as a hard line for judges to follow. For example, one team may have met with
the governor of their state and have press photos of the meeting, while another team has hosted three
outreach events that resulted in the formation of four new teams. We do not quantify the types of outreach
a team does in a way that makes it simple for judges to determine that one type of outreach is more
meaningful than another. Instead, we use the information received about outreach in conjunction with the
other award attributes the team displays.
Remember, the goal of team outreach is to further the mission of FIRST so that we can change the culture.
Outreach events that help accomplish the mission of FIRST should be recognized as having the highest impact.
Think Award Judges
Think Award judges review the engineering portfolios of the teams nominated by the interview panel judges.
They review the portfolios for content first, and only if there are multiple excellent portfolios should
they ask a team to share or provide additional information. This additional information could come from an
engineering notebook or presentation, but it could also be relayed verbally to the judges.
Control Award Judges
Control Award judges review the Control Award submission sheets for the teams nominated by the interview
panel judges. They review each sheet for sensor use, creativity, and how effective the code the team
describes is in the robot game. Control Award judges also watch matches played by the Control Award
nominees to confirm that the code is effective and works as described.
• In REMOTE events, judges watch the video link the team has provided, which shows the team’s control
component in use.
• Teams may not include links to additional content in their Control Award submission. Judges are
instructed to ignore links to code or other information provided by teams.
Judges’ Choice Award Judges
Teams who have not been nominated for any awards should be interviewed by the Judges’ Choice Award
panel(s). The Judges’ Choice Award panel looks for interesting stories, unique robot design, extraordinary
Gracious Professionalism, teamwork, collaboration, and other outstanding team qualities.
Match Observation
Match observers are assigned to a field and have a match observer tracking sheet for each team. They watch
the teams during matches and add comments to their tracking sheets. Match observers look for robot
performance, strategies, how a team responds to wins and losses, an abundance or absence of penalties, how
a team collaborates with its alliance partner, and other on-field behaviors.
• For REMOTE events, there are no match observers.
Final Deliberations
Once the judges have had the chance to interview teams, watch match play, review the engineering portfolios,
and visit the teams, they must come together and decide the winners of each award. Judges meet in the
deliberation room to go through the teams nominated during the first deliberations and pick the finalists
and winners for each award. The goal is to narrow each nomination list to six teams and rank those top six
contenders for each award.