What do you dislike about Watermark Planning & Self-Study?
One problem is that you need to know where to go to enter certain information. For example, at my institution we are asked to provide a summary result in narrative form that directly corresponds to the target criteria for success. If I have a learning outcome and use the graph feature (met, exceeded, etc.), which is not really useful at all, I also have to write out the actual result. I need to know where to click to activate both the graph and the summary, and that is not very visible.
Another problem: I like to attach supporting documentation for both my measures and results, but it wasn't clear that attachments in the measure description section get carried over year to year while attachments in the results section do not. That was not at all apparent to me as a user, though now I know. There is also no good place to include documentation about how results were disseminated, nor is there a field to explain whether and when assessment results and actions were shared internally or with other stakeholders.
The graph feature is only available on learning outcomes, and even then there is no value added by a graph this generic. For one, people can't really see which expectations were met (relative to what?). Additionally, the graphical representation of summary results is not actionable: only if results are truly bad will the graph show some general need to improve. Otherwise it's pretty useless (though it sure looks professional to have a graph). Why not make the graph customizable? For example, if I used a rubric with multiple domains, had a multiple-choice exam with multiple component areas, or wanted to compare overall scores for subpopulations of students, then a customizable graph, perhaps in the analysis section of the report, would be much more useful.
The actions section, while good at tracking updates, embeds those updates in the report where the action originated. This is not very helpful, because actions that are still in progress or actively being implemented do not get carried over from year to year with a thread of updates on what was done and when. So the actions feature is nice but also somewhat useless: it requires users to go back to the original report, or to copy and paste actions into the current report year, to provide an update. I want to be able to show the progression of actions as they move from proposal to in progress to complete, and then show how each action impacted results.
There are also no sections to propose and track actions at the programmatic level, only at the measure or outcome levels. I do not like how actions get tracked on the report at the outcome level, because they look out of place. When actions are provided at the measure level, they look clean and help the reader connect the dots between 1) SLOs, 2) measures, 3) results/analysis, and 4) actions.
Finally, there is no section to include other updates, explanations, or relevant context about the state of the program or unit (for example, organizational changes, new leadership, low enrollment, or a signature initiative at the college level).

Review collected by and hosted on G2.com.