CBT HOTS Assessment Instrument Model: An Analysis of Teacher Perceptions of the Role of Illustrations and of Quiz and Question Settings in HOTS Assessment

21st-century learning requires HOTS-oriented instruction to improve students' higher-order thinking skills, and assessment must promote HOTS as well. However, many teachers have not yet produced HOTS assessments that properly stimulate students' higher-order thinking skills. This study examines teacher perceptions of a HOTS CBT assessment instrument model, specifically the role of illustrations and of quiz and question settings in measuring HOTS. The research is a mixed-method study using a Sequential Explanatory Strategy. It was conducted with five high school physics teachers in Metro City; data were collected through a teacher perception questionnaire administered in April 2019 and analyzed with descriptive quantitative methods. As many as 96% of teachers strongly agree that the Computer Based Test (CBT) model should involve video, animation, and simulation illustrations; 80% of teachers agree that item illustrations can enrich the levels of thinking (C1, C2, C3, C4, C5, C6) and the competency achievement indicators (GPA) to be measured; and 92% of teachers strongly agree that item illustrations in the form of videos, animations, and simulations can reduce verbalism. Beyond the stimuli and question types, teachers also consider quiz and question settings necessary for improving higher-order thinking skills (HOTS). Based on the data obtained, a validation instrument was developed to identify the question types, stimuli, and quiz and question settings best suited to improving students' higher-order thinking skills (HOTS).


INTRODUCTION
Learning is a process of behavior formation through experience or habit that aims to improve one's competence in responding to the challenges of the times. Learning can take place anywhere, for instance in schools, which are institutions that conduct learning systematically. The learning process at school continues to undergo innovation as the demands of the times change. Current learning is directed at answering the demands of life in the 21st century, which require students to have the ability to collaborate, communicate, and think critically and creatively (Milaturrahmah et al., 2017). The revised 2013 curriculum matches these demands, so the learning done at school should be HOTS-oriented. HOTS learning will produce students with higher-order thinking skills if students are accustomed to problems involving problem solving, critical thinking, and creative thinking (Putri and Raharjo, 2017; Afandi et al., 2017; Mukminan, 2014); accordingly, the assessment given must also be HOTS, with questions in categories C4, C5, and C6. The HOTS assessment measures the ability to: 1) transfer one concept to another, 2) process and apply information, 3) seek connections among a variety of different pieces of information, and 4) use information to solve problems (Widana, 2016; Brookhart, 2010).
A previous survey shows that as many as 83% of teachers have tried to write HOTS questions, but the resulting questions only measure students' ability to analyze, which cannot by itself capture critical thinking skills. Another problem that keeps a test from being HOTS is that the assessment instruments given to students consist only of narrative statements, which are less able to stimulate students' higher-order thinking about a real problem. As the Australian Education Research Council (2015) notes, creativity in solving HOTS problems is the ability to evaluate existing strategies and find new solutions.
Accordingly, a HOTS assessment instrument is needed that can stimulate students' higher-order thinking skills. HOTS assessment instruments equipped with moving and still images, and grounded in real-life problems, will stimulate these skills (Masek and Yamin, 2011). As many as 96% of teachers agree that the Computer Based Test (CBT) model allows items to be illustrated with videos, animations, and simulations. Writing test questions equipped with still and moving images becomes possible once questions are no longer printed on paper but delivered through existing technology such as computers. As many as 80% of teachers agree that item illustrations in the form of videos, animations, and simulations can enrich the competency achievement indicators (GPA) and the levels of thinking to be measured. This is supported by Andi et al. (2019), who report that using the internet in learning is more effective in improving students' creative thinking skills. Another benefit of using a computer as a test tool is the sensory stimulation students receive from dynamic features such as video, animation, and simulation (Thelwall, 2000; Huang et al., 2009). Martin (2008) also notes that audio and video feedback can help students better understand the answers to questions they could not solve, and interactive video can raise students' thinking skills to the highest level (Pertiwi et al., 2019).
The use of computers as a testing medium has in fact already become an alternative evaluation tool and a supplement to conventional paper-based tests (PBTs) (He and Tymms, 2005; Smoline, 2008; Triantafillou et al., 2008). Computer-based testing provides several benefits, such as real-time scoring (Paek, 2005) and test experiences that students find more enjoyable and from which they can better internalize feedback (Bjornsson, 2008). However, the questions presented on the computer are often still the same as those printed on paper: narrative statements that do not stimulate students. As many as 92% of teachers strongly agree that item illustrations in the form of videos, animations, and simulations can reduce verbalism (questions that are too long). For this reason, the researchers will develop computer-based tests on dynamic fluid material equipped with illustrations in the form of videos, simulations, animations, images, and discourse. Dynamic fluids are one of the basic competencies in physics that is widely applied in current technology; one example of this application is the working principle of an aircraft. Using computers, the researchers will develop dynamic fluid test questions equipped with moving and still images; for example, an aircraft taking off will be presented as a video with supporting information so that it can stimulate students' higher-order thinking skills.
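To make the aircraft example concrete, the following is a minimal sketch, not part of the study's instrument, of the Bernoulli relation that such a dynamic fluid item would draw on; the airflow speeds, wing area, and air density are illustrative values chosen for the example.

```python
# Bernoulli's principle for level flow: P + 0.5*rho*v**2 = constant.
# Faster airflow over the top of a wing lowers the pressure there,
# producing the net upward force that lets an aircraft take off.

RHO_AIR = 1.225  # air density in kg/m^3 (sea level, illustrative)

def pressure_difference(v_top, v_bottom, rho=RHO_AIR):
    """Pressure difference (Pa) between the lower and upper wing surface."""
    return 0.5 * rho * (v_top**2 - v_bottom**2)

def lift_force(v_top, v_bottom, wing_area, rho=RHO_AIR):
    """Approximate lift (N) over a wing of the given area in m^2."""
    return pressure_difference(v_top, v_bottom, rho) * wing_area

# Example: airflow at 70 m/s over the wing, 60 m/s under it, 20 m^2 wing.
print(lift_force(70.0, 60.0, 20.0))
```

A HOTS item built on this stimulus would ask students not to plug in numbers but, for instance, to evaluate (C5) why the simple Bernoulli account is incomplete, or to create (C6) a design change that increases lift.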
The researchers also want to know whether the question type affects students' higher-order thinking skills, so the test questions will be developed into three types: multiple response, sequence, and matching. In addition to the stimuli (videos, simulations, animations, and still images) and question types, the researchers also want to find out whether the quiz and question settings of a test can affect students' higher-order thinking skills.

METHOD
This research is a mixed-method study using a Sequential Explanatory Strategy. It was conducted with five high school physics teachers in Metro City. Data were collected through a teacher perception questionnaire administered in April 2019; supporting data were obtained through a review of several journals. The data collection instrument was a Likert-scale questionnaire with answer choices "5" for strongly agree, "4" for agree, "3" for moderately agree, "2" for disagree, and "1" for strongly disagree. The data collected describe teacher perceptions of a CBT-based HOTS assessment instrument appropriate for school use, with regard to the role of illustrations and the quiz and question settings suitable for measuring students' higher-order thinking skills in dynamic fluid learning. The indicators used by teachers to judge whether the questions they have used or written are HOTS or LOTS are based on the characteristics of HOTS test instruments according to Schraw et al. (2011: 191): a HOTS instrument must measure students' ability to analyze (C4), i.e., to classify the differences and similarities in a problem; to evaluate (C5), i.e., to weigh the good and bad of an event; and to create (C6), i.e., to produce an idea or opinion.
The teacher response questionnaire was analyzed quantitatively: the total score given by the five teachers on each question was divided by the maximum score and multiplied by one hundred percent. The results were interpreted according to Table 1 and analyzed using descriptive quantitative analysis. Based on this analysis of teacher perceptions of a CBT-based HOTS assessment instrument suitable for school use, with regard to the role of illustrations and of quiz and question settings appropriate for measuring students' higher-order thinking skills in dynamic fluid learning, a HOTS CBT assessment instrument design model on dynamic fluid material was drawn up.
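The percentage computation described above can be sketched as follows; the example ratings are illustrative, and the interpretation cut-offs of Table 1 are not reproduced here.

```python
# Perception score for one questionnaire item: the sum of the five
# teachers' Likert ratings (1-5) divided by the maximum possible score
# (number of teachers x 5), expressed as a percentage.

def perception_percentage(ratings, max_rating=5):
    """Percentage of the maximum score earned by a list of Likert ratings."""
    max_score = max_rating * len(ratings)
    return sum(ratings) * 100 / max_score

# Example: five teachers rate one item 5, 5, 5, 4, 5 out of 5.
print(perception_percentage([5, 5, 5, 4, 5]))  # 96.0
```

A score of 96.0 here corresponds to the "96% of teachers strongly agree" figures reported in the results.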

RESULT AND DISCUSSION
The following are the results of the analysis of teacher perceptions of the HOTS CBT instrument.

A. Item Illustrations
1. As many as 96% of teachers strongly agree that the Computer Based Test (CBT) model allows items to be illustrated with videos, animations, and simulations.
2. As many as 80% of teachers agree that item illustrations such as pictures, graphics, videos, animations, and interactive simulations can enrich the diversity of computer-based question types (true-false, multiple choice, sorting, matching, short answer, etc.).
3. As many as 88% of teachers strongly agree that item illustrations can enrich the levels of thinking (C1, C2, C3, C4, C5, C6) to be measured.
4. As many as 80% of teachers agree that item illustrations strongly support assessment for learning.
5. As many as 80% of teachers agree that item illustrations in the form of videos, animations, and simulations can enrich the competency achievement indicators (GPA) to be measured.
6. As many as 92% of teachers strongly agree that item illustrations in the form of videos, animations, and simulations can reduce verbalism (questions that are too long).

B. Quiz/Question Settings
7. As many as 88% of teachers strongly agree that CBT items should be presented in random order.
8. As many as 72% of teachers agree that CBT answer options should be presented in random order.
9. As many as 68% of teachers agree that the CBT screen should display the time spent on the test items.
10. As many as 92% of teachers strongly agree that the CBT screen should display the remaining time available for the test.
11. As many as 88% of teachers strongly agree that each item should be weighted according to the level of thinking (C1, C2, C3, C4, C5, C6) needed to answer it.
12. As many as 96% of teachers strongly agree that each item should be weighted according to its difficulty level.
13. As many as 56% of teachers moderately agree that each item should be weighted according to the question type.
14. As many as 84% of teachers strongly agree that a CBT test should be designed to give feedback from the system, in the form of follow-up, to students who answer the questions correctly.
15. As many as 72% of teachers agree that a CBT test should be designed to give feedback from the system, in the form of follow-up, to students who cannot answer the questions correctly.
16. As many as 84% of teachers strongly agree that CBT feedback from the system should be given for each item as it is completed.
17. As many as 52% of teachers agree that feedback should be given only after all items have been worked on.
18. As many as 88% of teachers strongly agree that CBT tests should offer the opportunity to retake the test.
19. As many as 48% of teachers moderately agree that on a CBT test answers should be submitted (click submit) after each item is completed, with the risk that students cannot rework earlier questions.
20. As many as 88% of teachers strongly agree that on a CBT test answers should be submitted (click submit) after all items have been completed, or after time runs out, with the consequence that students can still rework earlier questions.
Based on the Table 3 analysis of item illustration design, CBT-based HOTS questions will be developed on dynamic fluid material. The questions will be developed into three types: multiple response, sequence, and matching. Each type covers the thinking levels of analyzing (C4), evaluating (C5), and creating (C6), and the questions will be completed with illustrations in the form of videos, pictures, and discourse.
Based on the Table 3 analysis of quiz and question settings, each type of test question will be developed in two settings. In the first setting, questions and answer options are presented in random order, the time used is displayed, and feedback is given after each question is completed. In the second setting, questions and answer options are presented in a fixed (non-random) order, the remaining time is displayed, and feedback is given after each question is completed. In both settings, each item is weighted according to its difficulty level. The CBT-based HOTS assessment instrument will then be validated by several experts using the assessment design in Table 4. The validation results will be used to identify the question types and stimuli best suited to improving students' higher-order thinking skills, and the quiz and question settings will likewise be assessed to see which settings can improve those skills.
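The two quiz settings described above can be sketched as follows; this is an illustrative outline only, with hypothetical names and data, not the study's actual CBT system.

```python
import random

# Illustrative CBT quiz settings. Setting A shuffles both the item order
# and each item's answer options; setting B keeps the authored order.
# In both settings each item carries a difficulty weight.

def prepare_quiz(items, randomize=True, seed=None):
    """Return a copy of the item list, shuffled when randomize is True."""
    rng = random.Random(seed)
    quiz = [dict(item, options=list(item["options"])) for item in items]
    if randomize:  # setting A; setting B passes randomize=False
        rng.shuffle(quiz)
        for item in quiz:
            rng.shuffle(item["options"])
    return quiz

def weighted_score(items, answers):
    """Difficulty-weighted score as a percentage of the maximum."""
    earned = sum(i["weight"] for i in items if answers.get(i["id"]) == i["key"])
    total = sum(i["weight"] for i in items)
    return earned * 100 / total

# Hypothetical items with difficulty weights rising with thinking level.
items = [
    {"id": 1, "options": ["A", "B", "C"], "key": "B", "weight": 1},  # C4 item
    {"id": 2, "options": ["A", "B", "C"], "key": "C", "weight": 2},  # C5 item
    {"id": 3, "options": ["A", "B", "C"], "key": "A", "weight": 3},  # C6 item
]

# A student who misses only the hardest (C6) item earns 3 of 6 points.
print(weighted_score(items, {1: "B", 2: "C", 3: "X"}))  # 50.0
```

Comparing student performance under `prepare_quiz(..., randomize=True)` and `randomize=False` mirrors the two-setting comparison the study proposes.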
The teachers also believe that an e-learning module integrated with STEM will be more innovative and practical for students. According to Afriana (2016), in STEM learning students have the opportunity to study science, mathematics, and engineering, so problem-based projects (PjBL) are needed to enable students to solve problems and improve their critical thinking skills; one effort to develop critical thinking skills, as a demand of the global era, is to accustom students to solving physics problems not only at the end of learning but also at the beginning. An obstacle, however, is that 90% of students do not have a personal laptop, while 90% have an Android mobile phone. This makes the e-module more practical, because it can be accessed from both a mobile phone and a laptop; the Android phone is preferred because it is a communication tool used every day.

CONCLUSION
Based on the questionnaire distributed to five high school physics teachers, it can be concluded that the CBT HOTS assessment instrument model allows questions to be illustrated with videos, animations, and simulations, and that the model can enrich the levels of thinking (C1, C2, C3, C4, C5, C6) to be measured, enrich the competency achievement indicators (GPA) to be measured, and reduce verbalism (questions that are too long). The data also show the teachers' perceptions of the quiz and question settings that are good for improving students' higher-order thinking skills. On this basis, validation instruments were developed to assess which question types, illustrations, and quiz and question settings are best for improving students' higher-order thinking skills.