Glossary of terms

(A)

Accessibility: For Certiport, this means designing our software, exam content, documentation, and websites to be as accessible as possible to persons with disabilities.

ACU: Autodesk Certified User

(B)

Beta: Betas are preliminary versions of certification examinations given to actual certification candidates. They are conducted by Certiport to collect data and evaluate certification examination items. The statistical results of item performance, along with candidate feedback, help Certiport select the best assessment items to use in the final version of the certification examination.

Blueprint: A detailed, written plan for a certification examination that typically includes descriptions of the certification’s purpose and target audience, the content or performance areas it will cover, the types and number of items to be written for each content or performance area, and other characteristics.

(C)

CAP: Certiport Authorized Partner

CATC: Certiport Authorized Testing Center

Certification: An official recognition that an individual has achieved a particular level of knowledge or skill in a specific content area.

Constructed response: A certification item where the test candidate is required to supply an answer rather than choose it from a list of responses. Examples of constructed response items include essay, fill-in-the-blank and short answer.

Criterion-referenced assessment: Typically assesses a given content area in depth (several items per topic within the content area). Test content is closely aligned to a set of content and curriculum standards. In addition, a set of cut scores is developed to define different levels of proficiency within that content area. Student performance is compared to these standards rather than to the performance of other students.

Cut score: The final pass/fail point determined for an exam. It is the minimum raw score a candidate must achieve to be classified as proficient.

(D)

Distractor: A response option that is not correct. It is intended to provide a candidate with a plausible, yet incorrect, alternative to the correct answer.

(E)

ESB: Entrepreneurship and Small Business Exam

ECA: EC-Council Associate

Equating: A statistical procedure that allows valid comparisons to be drawn across exam forms. Equating is performed to address minor differences in difficulty across multiple forms.
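
A minimal sketch of one common approach, linear equating, is shown below; the method and the raw scores are illustrative assumptions only, not Certiport’s actual procedure. It maps a raw score on one form onto another form's scale by matching the means and standard deviations of the two score distributions.

    # Illustrative sketch of linear equating; data and method are hypothetical,
    # not Certiport's actual procedure.
    from statistics import mean, stdev

    def linear_equate(score_x, form_x_scores, form_y_scores):
        """Map a raw score on form X onto form Y's scale by matching means and SDs."""
        mu_x, sd_x = mean(form_x_scores), stdev(form_x_scores)
        mu_y, sd_y = mean(form_y_scores), stdev(form_y_scores)
        return mu_y + (sd_y / sd_x) * (score_x - mu_x)

    form_a = [28, 31, 35, 40, 33, 37, 29, 36]   # hypothetical raw scores, form A
    form_b = [30, 34, 38, 41, 35, 39, 31, 37]   # hypothetical raw scores, form B
    print(round(linear_equate(35, form_a, form_b), 1))  # 35 on form A, on form B's scale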

Exhibit: Additional information (text or graphics) provided for use as a reference in one or more items during a test. In a computer-based test, an exhibit is typically displayed in a separate window from the item; accessing it is optional.

(F)

Form(s): For exam security, we create multiple but similar versions of an exam; each version is referred to as a form. This allows candidates to receive a different version if they take the exam more than once. The forms are equated so that no form is harder than another.

(G)

GS5, GS6: GS stands for Global Standard. This is a designation used for our IC3 Digital Literacy exams, where the number indicates the generation of the certification.

(I)

IC3: Internet Core Competency Certification, otherwise known as Certiport’s Digital Literacy Certification.

Item: The generic term for an individual question or task that forms part of an exam.

Item analysis: Statistical analysis of candidates’ responses to examination questions, performed to gain information about the quality of those questions.
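
A minimal sketch of two statistics often reported in an item analysis is shown below: difficulty (the proportion of candidates answering an item correctly) and point-biserial discrimination (the correlation between the item score and the total score). The data and code are illustrative only, not Certiport’s actual tooling.

    # Hypothetical item-analysis sketch for dichotomously (0/1) scored items.
    from statistics import mean, pstdev

    def item_analysis(responses):
        """`responses` holds one list of 0/1 item scores per candidate."""
        totals = [sum(r) for r in responses]
        mu, sd = mean(totals), pstdev(totals)
        results = []
        for i in range(len(responses[0])):
            item = [r[i] for r in responses]
            p = mean(item)  # difficulty: proportion answering the item correctly
            if sd == 0 or p in (0, 1):
                rpb = 0.0   # discrimination is undefined for degenerate items
            else:
                mu_correct = mean(t for t, x in zip(totals, item) if x == 1)
                rpb = (mu_correct - mu) / sd * (p / (1 - p)) ** 0.5
            results.append((i + 1, round(p, 2), round(rpb, 2)))
        return results  # (item number, difficulty, discrimination)

    # Example: 4 hypothetical candidates answering 3 items
    print(item_analysis([[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 1, 1]]))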

(J)

Job task analysis (JTA): A systematic process used to determine what competencies are required to perform a particular task or job. It generally involves interviews, surveys, or focus groups to identify the primary concepts, job responsibilities, tasks, and subtasks required. Certiport uses JTAs to guide the selection of content areas to include on a certification.

(L)

Live-in-the-Application exams (LITA): A type of exam where the application is installed on the testing machine. The test candidate uses the actual software application to complete requested tasks within the exam.

Localization: The process of translating content and UI into a variety of languages.

(M)

MCE: Microsoft Certified Educator

MCP: Microsoft Certified Professional

MOS: Microsoft Office Specialist

(O)

Objective domain: The skills and abilities that need to be proven in any given certification examination. View our Objective Domains page.

Organization Administrator: A role within the Certiport Authorized Testing Center account that is used to manage account associations, make purchases, or run reports. It should not be confused with a Windows Administrator, which is a network-level user account with high-level access.

(P)

Psychometrics: A field of study concerned with the theory and technique of psychological measurement.

(Q)

QBCU: QuickBooks Certified User

(R)

Reliability: A measure of the consistency or stability of test scores for a group of test-takers over time, administration conditions, examination forms, or samples of items.
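
A minimal sketch of one common internal-consistency estimate for 0/1-scored items, KR-20, is shown below; the data are hypothetical, and this is not necessarily the statistic Certiport reports.

    # Illustrative KR-20 internal-consistency estimate for 0/1 item scores.
    from statistics import pvariance

    def kr20(responses):
        """`responses` holds one list of 0/1 item scores per candidate."""
        k, n = len(responses[0]), len(responses)   # items, candidates
        totals = [sum(r) for r in responses]
        pq_sum = 0.0
        for i in range(k):
            p = sum(r[i] for r in responses) / n   # proportion answering item i correctly
            pq_sum += p * (1 - p)
        return (k / (k - 1)) * (1 - pq_sum / pvariance(totals))

    # Hypothetical 0/1 scores: 4 candidates x 5 items
    scores = [[1, 1, 0, 1, 1], [1, 0, 0, 1, 0], [0, 1, 1, 1, 1], [1, 1, 1, 1, 1]]
    print(round(kr20(scores), 2))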

(S)

Scaled scores: The method used to allow candidates to see whether their performance is improving when a retake is needed. Most Certiport exams are scaled to a passing score of 700. The actual number of questions a candidate must answer correctly to pass an exam is determined by the difficulty level of the questions and our expectations of the skills and abilities of the target audience. For most Certiport exams, we transform that number of correct answers needed to pass into a scaled score of 700.
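
A minimal sketch of the general idea is shown below; the 50-item form, raw cut score of 35, and 100-1000 reporting range are hypothetical assumptions, not Certiport’s actual scaling parameters.

    # Hypothetical raw-to-scaled transformation: a piecewise-linear map that
    # anchors the raw cut score at a scaled score of 700.
    def scale_score(raw, raw_cut=35, raw_max=50, lo=100, hi=1000, scaled_cut=700):
        """Map a raw score onto the reporting scale, pinning raw_cut to scaled_cut."""
        if raw >= raw_cut:
            return scaled_cut + (raw - raw_cut) * (hi - scaled_cut) / (raw_max - raw_cut)
        return lo + raw * (scaled_cut - lo) / raw_cut

    print(round(scale_score(35)))  # exactly at the cut -> 700
    print(round(scale_score(42)))  # above the cut -> 840
    print(round(scale_score(20)))  # below the cut -> 443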

Selected/Standard response: A type of certification examination item in which the candidate is required to select the correct answer from a list of answer options. Examples of selected response items include multiple-choice, multiple-selection, matching, drag & drop, sequencing/list, fill-in-the-blank and hot area (graphic multiple choice).

Skill Group: During the creation of a certification examination, subject matter experts define the skills that the target audience for certification should be able to demonstrate. These skills are evaluated for functional importance and grouped into topics, which we refer to as sections or skill groups. If the skills within a section are considered essential to the subject, that section will contain more items than a less fundamental section, and its items will require a higher degree of subject knowledge. As a result, the number of items varies from section to section, as does their level of difficulty.

Simulation: A type of exam item where certain aspects of an application are re-created in order to simulate being in the actual program.

Sitting: Exam delivery involves many stages, including registration, preparation, learning, and scoring; this term describes the time a test candidate is actually “in their seat” taking a certification exam.

Standard setting: Committee-based process for acquiring or refining achievement level descriptors and determining cut score recommendations for an exam. Subject matter experts confirm that test-takers who pass the exam have achieved a justifiable and defensible standard.
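
A minimal sketch of one widely used committee method, the modified Angoff procedure, is shown below with hypothetical ratings; Certiport’s specific process may differ. Each subject matter expert estimates the probability that a minimally qualified candidate would answer each item correctly, and the averaged totals across the panel become the recommended raw cut score.

    # Hypothetical modified-Angoff sketch for a 5-item exam.
    ratings = {
        "SME 1": [0.8, 0.6, 0.9, 0.5, 0.7],
        "SME 2": [0.7, 0.7, 0.8, 0.6, 0.6],
        "SME 3": [0.9, 0.5, 0.9, 0.5, 0.8],
    }
    per_rater_cut = {name: sum(probs) for name, probs in ratings.items()}
    recommended_cut = sum(per_rater_cut.values()) / len(per_rater_cut)
    print(per_rater_cut)              # each rater's implied raw cut score
    print(round(recommended_cut, 1))  # panel recommendation: 3.5 of 5 items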

Stem: The question portion of an item; it presents the problem to be solved.

Subject Matter Expert (SME): A person who provides expertise in a specific field of knowledge.

(T)

Task: A request within an exam item for a test candidate to demonstrate a specific skill.

(V)

Validity: The degree to which available evidence and theory support scoring interpretations for a test. There are many aspects of test validity: content, construct, concurrent, and predictive.


Developing a certification examination: