Test and Validation

 

Media check


Identifying problems with links to images and other media files is quite time-consuming, particularly for questionnaires that contain large numbers of images. The check functions located in the Test and validation → Media check menu will make your work easier.

You can display either all media, only valid media or only media that are not available. This process will check not only multimedia elements from the questionnaire, but also buttons, layout elements and images used in the templates. The display contains the following information for each media file:

  • URL

  • Preview

  • Source element

  • Context of use

  • Media type

  • Language: In the case of multilingual projects, this column contains the language of the questionnaire in which the file is used.

  • Status: The traffic-light icon indicates whether a media file is available or invalid.

Using the link in the “Actions” column, you can open the respective menu and edit the media file.

 

Consistency check


The Test and validation menu contains the Consistency check function. Clicking on this menu item opens an additional window with a codebook that displays the conditions for filters, hiding conditions, triggers and plausibility checks, along with information on the consistency of each setting. Traffic-light icons tell you whether a setting is consistent:

  • Green: The setting is consistent (i.e. all variables used actually exist in the project).

  • Yellow: The yellow color is only used with filters. It indicates that the filter conditions are consistent but the filter has not yet been checked using the filter test.

  • Red: The setting is not consistent (i.e., for example, it accesses variables that no longer exist).

The consistency check covers project variables (v_100n), user-defined variables (c_000n) and URL parameters (p_000n).
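For illustration, the following conditions use all three variable types in a Lua-style notation. The variable names, and the assumption that v_2005 has been deleted from the project, are purely hypothetical; the sketch merely shows what the traffic-light icons react to:

    -- Consistent (green): all referenced variables exist in the project
    v_1001 == 2 and c_0001 > 0 and p_0001 == 1

    -- Inconsistent (red): v_2005 no longer exists in the questionnaire,
    -- so the condition accesses a variable that cannot be resolved
    v_2005 == 3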

 

Project test


The Project test function, located in the Test and validation menu, makes it easier for you to test filter settings by automatically simulating a large number of test sessions. The resulting statistics make it easy to detect problems with filters and internal quotas. The following sections explain the usage options offered by this function:

  • Producing test data

  • Interpreting test data

  • Deleting test data

Producing test data

After you have finished your questionnaire, open the Test and validation → Project test menu. This will open the entry page of the Project test menu, which provides an overview of the filters and variables used in the project. Click on the Produce test data tab and enter the conditions for the automatically generated test runs:

  • In the “Number of interviews to create” field, enter the number of questionnaire sessions that are to be generated automatically. The advisable number of sessions depends on the complexity of the project and the robustness of the server. On the one hand, high numbers of sessions (in the hundreds) deliver more meaningful results. On the other hand, projects with a complex filter structure can cause a considerable server load even with one hundred sessions. Before entering a large number of sessions, you should therefore make sure that no survey with a high number of participants is running at the same time on the same installation.

  • In the second field, you can specify the “Maximum number of pages sent per session”. The number entered should be greater than 0.

  • If the “Delete test data prior to execution?” option is enabled, data from previous test runs and any automatically generated test participants are deleted first.

  • “Personalized survey” project type only: If the “Use copies of existing participants?” option is enabled, EFS will use existing participants with disposition codes lower than 20 as templates for creating new test participants and will loop through them sequentially until the specified number of interviews has been created. This function allows you to test surveys in which, for example, filters and other conditions access specific participant variables for routing.

  • If the project contains numeric URL parameters, you can specify individual values or number ranges for them (see the sketch after this list). Please note that “Project test” does not support non-numeric URL parameters. With projects containing such parameters, “Project test” must be run before the URL parameters are configured.

  • If the project contains triggers, you can choose separately whether these may be activated during the project test. Before activating the trigger test function, make sure that this cannot inadvertently trigger infinite loops or the bulk dispatch of mails. Usually, however, a manual test is more advisable for checking the functionality of triggers than using the automatic project test.
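For the URL parameter settings mentioned in the list above, the entries could look as follows. The parameter names and the exact input format are hypothetical and may differ in your installation; the sketch merely illustrates the difference between a fixed value and a number range:

    p_0001: 2      (every generated test session receives the fixed value 2)
    p_0002: 1-4    (each generated test session receives a value between 1 and 4)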

 

Please note that EFS does not generate test data for the User-defined (911) question type and all other question types that are based on it, namely Audio player (911), Video player (911), Slider (numbers) (911), and LUA question type (911).

Routing statistics

The routing statistics are located on the Routing statistics tab. The following guiding questions will help you to interpret them. After the test run, first check whether the number of completed sessions (disposition codes 31, 32) equals the number of interviews you originally entered. To do so, open the Routing statistics tab and click on the Evaluate completed data records only link.

  • If all sessions were completed successfully, you can limit your analysis to these statistics, labeled Evaluate completed data records only.

  • If test sessions were interrupted as in the example shown, you should first identify the cause of the drop-outs. To do so, click on the Evaluate all data records link. Note that drop-outs do not necessarily indicate problems. In the example shown, for instance, the drop-outs are participants who were screened out after a quota was fulfilled.

Delete test data

The automatically generated data must be deleted once you have completed the evaluation of the routing statistics and the field report. Otherwise, the data will be retained and included in the next test run or even in the evaluation of the field phase, thus distorting the results. To do so, click on the Delete test data tab, and then confirm by clicking on Delete test data.

  • In projects with internal quotas, the test run will change the current allocation and the status of the quota under Questionnaire editor → Quotas. In order to reset the allocation to zero, the project must be newly compiled after completion of the test. For this, the “Reset survey completely” option must be selected.

  • If “Use copies of existing participants” was disabled for a test run in a personalized survey, then a test participant is automatically entered in the project’s participant administration for each generated session. These test persons receive the e-mail address of the user who launched the test. You should delete these automatically generated persons and their test data before the project goes into the field phase. To do so, click on the Delete test data tab, and then confirm by clicking on Delete test data.

  • If “Use copies of existing participants” was enabled for a test run in a personalized survey, or if the project is an employee survey, datasets of existing participants are used for the project test. This artificially generated test data must be deleted before the project enters the field phase. To do so, you can either apply the “Reset with data deletion” action to all participants in participant administration or compile the project anew (navigate to Projects → {Selected project} → Compile and select the “Reset survey completely” option).

 

Project check


The project check informs you about the progress of project creation and any faults made along the way.

  • The upper part of the dialog contains an overview of central topics.

  • Furthermore, questionnaire structure, filter conditions, timing etc. are checked for logical consistency. Potential problems are listed with the respective problem grade.

  • Neither the Project check nor the Check layout function can be used with the “Responsive Layout”; the checks are optimized for the classic system layout.

The overview lists the following checks and their meaning:

  • Errors while processing the survey: EFS checks for errors in LUA filters, quotas and triggers while processing the survey.

  • Currently selected language: Indicates the standard language.

  • Translation status (multilingual projects only): Indicates whether all text elements of the different survey languages have been filled in, and links to the overview of the to-dos.

  • Layout status (only for projects with classic system layout): Provides information on the correctness of the layouts in use. If the layout is no longer current, you can click on the Check layout link to switch to the Pro editor, where you can fix the problem.

  • Number of variables in survey table: The number of variables; this number may change when compiling the project.
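To illustrate the first check listed above, a LUA filter that does not guard against missing values can fail while the survey is being processed. The following sketch is purely hypothetical; in particular, the way a survey variable such as v_1001 is made available to the Lua code is an assumption made only for illustration:

    -- Hypothetical LUA filter condition.
    -- If v_1001 has not been answered yet, tonumber() returns nil, and an
    -- unguarded comparison such as "tonumber(v_1001) > 2" would abort with
    -- "attempt to compare nil with number" - an error this check would report.
    local answer = tonumber(v_1001)  -- assumes v_1001 holds the answer as a string
    return (answer or 0) > 2         -- guarded comparison that cannot fail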

 

Reset survey


All data created, e.g. by tests in productive mode, must be deleted before the start of the survey. Open the Test and Validation → Reset survey menu, select the “Reset survey completely and delete all survey data collected so far” option and click on Reset survey. The database is then cleaned and all test data are deleted.

Completely resetting a survey before the field start

Before the field start, i.e. the start of the actual data collection phase, the survey is usually reset completely to remove various types of unwanted data from the database. For this purpose, the option “Reset survey completely and delete all survey data collected so far” is used.

  • Projects are usually reset completely before the field start to remove any results created during the test phase. This is advisable in particular if you have created real data records while testing an anonymous project, or if you have used not only dedicated test accounts but also real participant accounts when testing a personalized project.

  • If superfluous questions or pages were created and deleted later, the variables of these questions or pages are kept in the database. When resetting prior to the field start, these superfluous variables are displayed and you can optionally delete them. (They can also be used to restore data.) In principle, it is no problem to keep superfluous variables in the database, but in big, complex projects this clean-up may improve performance.

  • If you have used the pretest feature, you can optionally delete the pretest comments as well.

Resetting in later stages

As soon as the survey is in the field, i.e. as soon as data collection starts, the Reset feature should in principle no longer be used. If it becomes necessary in exceptional situations, you must use the option “Keep result data collected so far”.

  • Should anybody use “Reset survey completely and delete all survey data collected so far” by mistake in a running project, all results would be lost irrevocably!

  • Variables that already contain information should not be deleted under any circumstances, either.

If you need to reset the data records of specific participants, please use the dedicated actions in participant administration and in the member list of samples.

Resetting test accounts in projects with participant administration or sample

In personalized projects, the option “Delete test participants and their result data” can be used to selectively remove all participants flagged as testers and the respective result data. In panel surveys and master data surveys, survey status and result data of panelist test accounts will be reset, but the testers will remain in the sample.
