GUI for grading projects POA
Issue #208 requests a graphical user interface for grading student submissions. This will improve the speed at which Graders who are not experts with a text editor can grade projects. This Plan of Attack lays out the workflow and sketches some of the implementation of the applications that enable that workflow.
Each project submission is compiled, packaged, and executed against a series of test cases. The output of these test cases forms the foundation of what the project submission grader application works with. This output can be described as follows:
class ProjectSubmission {
  String projectName;      // Such as "Project2". This should match what the Submit program uses
  String studentId;        // Such as "whitlock"
  String mainClassName;    // Such as "edu.pdx.cs410J.whitlock.Student"
  DateTime gradingTime;    // When the submission was graded
  double maxScore;
  double score;            // Can't exceed maxScore
  String graderComment;
  List<TestCaseOutput> testCases;
}
class TestCaseOutput {
  String testCaseName;         // Such as "Compiling" or "Test 1"
  String testCaseDisplayName;  // Such as "Compiling source code" or "Test 1: No arguments"
  String commandLine;          // The command line that was executed
  List<TestCaseFile> filesBeforeTestIsRun;
  String programOutput;
  List<TestCaseFile> filesAfterTestIsRun;
  double pointsDeducted;       // Number of points lost due to the behavior of this test case
  String graderComment;
}
class TestCaseFile {
  String fileName;
  String fileContents;
}
Currently, the test cases are implemented using shell scripts that invoke mvn and java commands. The output of these shell scripts is a human-readable text file. Ideally, the tool that runs the test cases would output a file in a format (XML?) that could be easily read by the project grader GUI.
However, changing the way tests are executed could be fraught with error and could take a lot of time to implement.
As a first step, let's implement a tool that parses the text files outputted by the current scripts and converts them into a file format that is compatible with the project grader GUI.
Does someone want to design an XML DTD or Schema for the project submission objects?
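As one possible starting point (not a final design), the data model could be expressed as JAXB-annotated classes; the schemagen tool that ships with JAXB can then generate a first-draft XML Schema from them, and the same classes can marshal submissions to XML without a hand-written writer. The class and element names below simply mirror the sketch above; everything here is a hypothetical sketch, and the text-file parser described above would just populate these objects and call write.
import java.io.File;
import java.util.List;

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlRootElement;

public class SubmissionXml {

  // A pared-down, JAXB-annotated mirror of the classes above.  Running the
  // schemagen tool against it would yield a starting-point XML Schema.
  @XmlRootElement(name = "project-submission")
  @XmlAccessorType(XmlAccessType.FIELD)
  public static class ProjectSubmission {
    public String projectName;
    public String studentId;
    public double maxScore;
    public double score;
    public String graderComment;

    @XmlElement(name = "test-case")
    public List<TestCaseOutput> testCases;
  }

  @XmlAccessorType(XmlAccessType.FIELD)
  public static class TestCaseOutput {
    public String testCaseName;
    public String testCaseDisplayName;
    public String commandLine;
    public String programOutput;
    public double pointsDeducted;
    public String graderComment;
  }

  // Writes a submission to an XML file that the grading GUI could later read back in
  public static void write(ProjectSubmission submission, File xmlFile) throws Exception {
    JAXBContext context = JAXBContext.newInstance(ProjectSubmission.class);
    Marshaller marshaller = context.createMarshaller();
    marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);
    marshaller.marshal(submission, xmlFile);
  }
}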
We still want to execute the test cases for the project submissions on the PSU Linux machines because they provide an environment that is consistent and available to everyone involved in the course. However, a thick GUI client cannot be run efficiently on a remote machine. Therefore, the submission grading program must be run on the Grader's local machine.
So, how do the test case output files get from the PSU machines to the Grader's local machine? The project submission test case output files contain sensitive information, and we don't want to expose them via HTTP. We could use a tool like scp, but it would be really nice if they were under version control so that we can go back and look up older versions of the project output. That way, when a student resubmits, we can compare the output of their current submission with previous submissions. Version control will also detect if two Graders (or a Grader and Dave) inadvertently grade the same submission. However, VCS tools like Subversion or Git require that a server run in order to check out and check in files. This means that the submissions (test outputs and scores) would reside on a server which is likely outside of the control of Dave and the Graders.
There is another option (from the 1990s!): CVS. CVS is an old-school version control system that uses ssh to send files between a checkout and a repository. The CVS repository can reside on the CS Department's Linux machines on the Grader's account. Since the Graders can ssh to this account, they will be able to check out the CVS repository onto their local machine. Once the submissions have been scored, the Grader can commit the changed submission files back to the repository using CVS.
Once the project submission files with test case output have been transferred to the Grader's local machine, he or she can launch the project scoring GUI. The user interface will look something like this:
The Grader can view a list of all project submissions. The id of the student is listed.
Until a project submission is graded (that is, assigned a score), it is displayed in bold in the list to draw attention to the fact that it requires action.
Clicking on a submission displays the test cases in that submission and details about the first test case.
The Grader can view a list of the (short) names for the test cases in the selected submission.
In addition to the name, the Grader can easily see how many points (likely a portion of a point) were deducted from the overall grade because of the behavior of that test case.
When the name of the test case is clicked, details about that test case are displayed.
The Grader is able to view all of the information about a test case and the output of the program when the test case was exercised. From this information, the Grader is able to specify a number of points (likely a portion of a point) that should be deducted from the submission's score. The Grader is also able to make a comment that is specific to that test case.
The Grader can easily advance to the next test case in the submission.
Once the Grader has scored all of the test cases, he or she can assign an overall score for the project. By default, this value is computed to be the maximum score for the project minus the sum of all of the points that were deducted because of the behavior of the test cases (see the sketch below). The Grader may also choose to enter an overall score that differs from this default.
The Grader may also comment on the overall submission.
The Grader can easily advance to the next un-scored project submission.
When a score is assigned to a project submission, the score and any other information that has been provided by the Grader is written back to the file that describes the submission.
This is accomplished easily when the submission files are under version control.
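For concreteness, here is a minimal sketch of the default score computation mentioned above. It assumes the ProjectSubmission and TestCaseOutput classes sketched earlier in this document; the class and method names are just illustrative.
import java.util.List;

class ScoreCalculator {

  // Default score: the maximum score minus the points deducted by every test case.
  // The Grader can still override this value with a score of his or her own choosing.
  static double defaultScore(double maxScore, List<TestCaseOutput> testCases) {
    double deducted = 0.0;
    for (TestCaseOutput testCase : testCases) {
      deducted += testCase.pointsDeducted;
    }
    return maxScore - deducted;
  }
}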
We'll need a tool that converts an XML file into a human-readable report. The report will probably look a lot like the .out files that are generated by the grading shell script.
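A rough sketch of what that conversion might look like, assuming the hypothetical JAXB-annotated SubmissionXml classes suggested above; the report layout is only a placeholder for whatever format we settle on.
import java.io.File;
import java.io.PrintWriter;

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Unmarshaller;

class SubmissionReport {

  // Reads a graded submission from XML and writes a plain-text report,
  // similar in spirit to the existing .out files
  static void writeReport(File xmlFile, File reportFile) throws Exception {
    JAXBContext context = JAXBContext.newInstance(SubmissionXml.ProjectSubmission.class);
    Unmarshaller unmarshaller = context.createUnmarshaller();
    SubmissionXml.ProjectSubmission submission =
        (SubmissionXml.ProjectSubmission) unmarshaller.unmarshal(xmlFile);

    try (PrintWriter out = new PrintWriter(reportFile)) {
      out.println(submission.projectName + " graded for " + submission.studentId);
      out.println("Score: " + submission.score + " out of " + submission.maxScore);
      for (SubmissionXml.TestCaseOutput testCase : submission.testCases) {
        out.println();
        out.println(testCase.testCaseDisplayName + " (-" + testCase.pointsDeducted + ")");
        out.println(testCase.programOutput);
        if (testCase.graderComment != null) {
          out.println("Grader: " + testCase.graderComment);
        }
      }
      out.println();
      out.println(submission.graderComment);
    }
  }
}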
There is nothing new needed here. We can continue to use the StudentFileMailer class as long as the name of the file begins with the student's id.
The Model/View/Presenter (MVP) design pattern separates the code that works with an application's data and logic from the UI widgets that display the data and interact with the user. MVP allows the UI to be tested using lightweight unit tests because the majority of the code is plain old Java. Additionally, using a message bus allows the presenters to be decoupled so that they may be tested independently of each other. This pattern takes some getting used to, but it ultimately results in isolated components of logic that can be independently tested and evolved.
Views are specified as Java interfaces that have methods for modifying what is displayed in the UI and for registering callbacks that are invoked when the user interacts with the UI. For instance, the SubmissionsView, which lists the names of the submissions, might look something like this:
interface SubmissionsView {
  void setSubmissionNames(List<String> submissionNames);
  void setSubmissionNameSelectedListener(SubmissionNameSelectedListener listener);

  interface SubmissionNameSelectedListener {
    void onSubmissionNameSelected(String submissionName);
  }
}
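As an illustration (not part of the plan itself), a Swing implementation of this view might simply wrap a JList. Everything below other than the SubmissionsView interface is a hypothetical sketch; the widget code stays dumb and just forwards user interaction to whatever listener the presenter registers.
import java.util.List;

import javax.swing.DefaultListModel;
import javax.swing.JList;
import javax.swing.JScrollPane;

class SubmissionsPanel extends JScrollPane implements SubmissionsView {
  private final DefaultListModel<String> model = new DefaultListModel<>();
  private final JList<String> list = new JList<>(model);

  SubmissionsPanel() {
    setViewportView(list);
  }

  @Override
  public void setSubmissionNames(List<String> submissionNames) {
    model.clear();
    submissionNames.forEach(model::addElement);
  }

  @Override
  public void setSubmissionNameSelectedListener(SubmissionNameSelectedListener listener) {
    // Notify the listener only once the user has settled on a selection
    list.addListSelectionListener(event -> {
      String selected = list.getSelectedValue();
      if (!event.getValueIsAdjusting() && selected != null) {
        listener.onSubmissionNameSelected(selected);
      }
    });
  }
}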
The SubmissionsPresenter listens for a SubmissionsLoaded message and invokes the View's setSubmissionNames method with the appropriate list of names. When a submission name is selected, the SubmissionsPresenter sends a SubmissionSelected event/message on the message bus.
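Here is a minimal sketch of that presenter, using Guava's EventBus as a stand-in for whatever message bus we choose. The SubmissionsLoaded and SubmissionSelected classes below only illustrate the shape such messages might take, and the presenter assumes the ProjectSubmission and SubmissionsView sketches from earlier in this document.
import java.util.List;
import java.util.stream.Collectors;

import com.google.common.eventbus.EventBus;
import com.google.common.eventbus.Subscribe;

// Hypothetical message classes that travel on the bus
class SubmissionsLoaded {
  final List<ProjectSubmission> submissions;
  SubmissionsLoaded(List<ProjectSubmission> submissions) { this.submissions = submissions; }
}

class SubmissionSelected {
  final String submissionName;
  SubmissionSelected(String submissionName) { this.submissionName = submissionName; }
}

class SubmissionsPresenter {
  private final SubmissionsView view;

  SubmissionsPresenter(EventBus bus, SubmissionsView view) {
    this.view = view;
    bus.register(this);

    // Forward UI selections onto the bus so that other presenters can react to them
    view.setSubmissionNameSelectedListener(name -> bus.post(new SubmissionSelected(name)));
  }

  // Populate the view whenever the submissions have been loaded
  @Subscribe
  public void onSubmissionsLoaded(SubmissionsLoaded loaded) {
    view.setSubmissionNames(loaded.submissions.stream()
        .map(submission -> submission.studentId)
        .collect(Collectors.toList()));
  }
}
Because SubmissionsPresenter depends only on the SubmissionsView interface and the bus, it can be unit tested with a fake view and an in-memory bus, which is the payoff of the MVP approach described above.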