The age of taking assessments and applying for jobs using paper and pencil is long gone. Candidates expect to take assessments and apply for jobs using any device they want, which has put many organizations in a tough spot when it comes to offering assessments, especially if the assessment was designed to be taken on a traditional PC or laptop.
The demand for mobile-friendly testing gives organizations more reason to allow their candidates to take employee assessments on mobile devices, but what should you consider before introducing mobile assessments?
At SIOP 2018, we presented our research on the impact of assessment design on test performance and applicant reactions when using a mobile device rather than a PC/laptop. Here's what we found:
Mobile-First Test Design
“Mobile-first” has been a buzzword recently, spreading awareness that assessments should be designed specifically for mobile device administration. Some features of our mobile-first design include:
Responsive web design – HTML and CSS code that resizes the page layout based on the user’s device screen size
Sliders instead of traditional radio buttons for Likert-type response formats, with functionality that moves the slider bar when a mobile device user touches and drags it on the screen
Item types that do not require cumbersome interactivity made difficult by small mobile screens and touchscreens
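To make the responsive-design and slider features concrete, here is a minimal, hypothetical sketch of what a mobile-first Likert item might look like in HTML and CSS. The class names, breakpoint width, and item wording are illustrative assumptions, not the actual code used in our assessment platform.

```html
<style>
  /* Base layout for larger screens: label and response side by side */
  .likert-item { display: flex; align-items: center; gap: 1rem; }

  /* Responsive web design: below a narrow breakpoint (assumed 600px),
     stack the label above the slider so it fits a phone screen */
  @media (max-width: 600px) {
    .likert-item { flex-direction: column; align-items: stretch; }
  }

  /* Full-width slider gives a large touch target on mobile */
  .likert-item input[type="range"] { width: 100%; }
</style>

<div class="likert-item">
  <label for="q1">I enjoy working as part of a team.</label>
  <!-- Slider in place of radio buttons: 1 = Strongly Disagree, 5 = Strongly Agree -->
  <input type="range" id="q1" name="q1" min="1" max="5" step="1" value="3">
</div>
```

The native `<input type="range">` element responds to touch-and-drag out of the box, which is one reason sliders tend to be friendlier than small radio buttons on touchscreens.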
Traditional PC Design
This assessment design was created primarily for use on non-mobile devices (e.g., devices with keyboards, large screens, and wired connections). This assessment involves simulations, moving parts, and timed sections.
What did we find when we compared the two assessment designs?
We found that test design can impact candidates' test performance across different device types.
Candidates who took the traditional PC-designed assessment on a mobile device had lower test scores than candidates who took it on a non-mobile device.
These mobile users also reported less favorable applicant reactions than those taking the PC-designed assessment on a PC.
Candidates who took the mobile-first designed assessment showed no practical difference in test scores across device types. That is, mobile users and non-mobile users generally performed the same when taking a mobile-first assessment.
Interestingly, we found that there was no difference in applicant reactions between device users (mobile or PC) on the mobile-first design.
What does this mean?
These results show the importance of test design when implementing unproctored selection processes. Allowing candidates to take an assessment on any device they choose could put them at a disadvantage, or create a negative experience, when the assessment is not designed mobile-first. Test developers should therefore design with mobile devices in mind in addition to PC/laptop users, remembering that mobile-first designs tend to work well on any type of device.