
Whether it is powered by human intelligence or artificial intelligence, the purpose of any exam security measure is the same: to maintain fairness, create a level playing field, and protect the integrity of an academic qualification. Institutions have access to a multitude of tools to help them achieve this objective.
Faculty must return to this purpose before introducing any new technology that may affect the fairness and reliability of the measures they are implementing. PSI has always operated under the principle that proctoring students' exams remotely is a human-centric process that can be assisted by technology, but never wholly facilitated by it. Relying on AI and machine learning algorithms alone, without human objectivity, for a task like making determinations about human behavior is a risky proposition.
Many students have attested to this point over the past year, demanding more transparency and accountability from their institutions and technology vendors. Join us for this session, which will address key points faculty should consider when evaluating proctoring technology against the measures of fairness and reliability.
Click HERE to claim your badge and be one step closer to earning free swag!
