Introduction

This section describes how to use the LabHQ LIMS Lifecycle to effectively record and process data for laboratory testing.

Create Study

To create a Study, select the Create Study icon in the top right hand corner, or select ‘Create Study’ from the LIMS menu; the Create Study page will appear (See Figure 1 below).

Figure 1: Create Study

 

The user can then enter the following data:

Product – This field is a dropdown list which is populated from the products loaded into the system. Products will only be displayed where the user account is associated with a client that is, in turn, associated with the product. This is a mandatory field.

Client – This field appears where there is more than one client associated with the product selected.

Manufacturer – A list of manufacturers will appear where the selected client has manufacturers associated with it.

Study ID – The study ID is generated after creating the study.

Description – A free text field for the user to enter a description for the study.

Batch Number – An alpha-numeric field to record the batch number for the study. This field is mandatory.

Workflow – This is a mandatory field and will display the default workflow set up for the product.

Manufactured Date – This will default to today’s date.

Product Additional Fields – Where additional fields have been set up for a product, these fields will appear for the user to enter values. If an additional field is mandatory, it will be highlighted in red until the user enters a value. The user will also be prevented from creating the study until all mandatory fields have been entered.

When the user has completed entry of all the mandatory fields, the [Create Study] button will become active. When the user selects [Create Study], a pop-up will appear informing the user ‘Study NN has been created.’
Where the batch number already exists in the system, a warning will appear: ‘Batch number already exists for a Study. Do you wish to proceed?’ The user can select ‘No’ to edit the batch number, or ‘Yes’ to proceed.

Receive Samples

Sample receipt may be used to identify the point at which samples have entered the laboratory, without the need for completing registration. This enables the generation of accurate metrics (e.g. efficiency monitoring). A user can receive samples by selecting the Receive Samples icon or selecting ‘Receive Samples’ from the LIMS menu.

When the Receive Samples page is displayed (See Figure 2 below), a list of studies awaiting receipt is populated in the left pane, along with the study batch number and product. When the user selects a study from the left pane, all jobs and corresponding samples for the study are displayed in the grid on the right.

 

The Receive Samples grid is grouped by job ID, and any samples which are awaiting receipt will be displayed beneath their relevant job ID. Job details include job template name and description, the number of samples available within the job (NB: if a sample is received separately from the other samples within a job, it will still be included in the sample count at Receive Samples), client name, created date and client/product job additional fields (if applicable).

Information displayed at the sample level includes sample number, sample template name, test suite, number of labels to print and client/product sample additional fields (if applicable). The Test Suite drop down displays a list of test suites assigned to the sample template at product setup and automatically selects the default test suite; the user may amend the test suite if required. The No. of Labels field can be used to define the number of labels to print for each sample received. Again, this field is automatically set to the default value configured on the sample template at product setup.

Before receiving samples for a study, the study can be assigned a Person Responsible. This field automatically defaults to the user carrying out the sample receipt, but this can be amended using the drop down menu which contains a list of all LIMS users. This is usually the person responsible for overseeing that all samples in a study are processed (NB: this does not have to be the same user responsible for sample testing).

To receive a sample, check the box to the left of the desired sample ID, or check the select all tick boxes found next to the job ID or study ID to select multiple samples at once. Selecting the tick box next to the study ID in the header will automatically select all jobs and samples in the grid, whereas selecting the tick box next to a job ID will only select that job and its corresponding samples. Once all desired jobs/samples are selected for receipt, select the [Receive] button.

If the study workflow is configured with alerts, a prompt will be displayed to check if notification is required. Selecting [Yes] will trigger a user alert, or in the case of an email alert the recipients of the email can be specified and the subject/message body may be amended prior to sending. Alternatively, if a notification is not required the user can select [No] on the alert prompt.

It is possible to receive jobs and samples within a study at different points in time. If not all samples are received for a study, the study will remain in the left pane until all samples have been received. Once all samples have been received for a study, the study will be removed from the left pane. Following receipt, additional samples may be added via View Samples.

Printing Sample History

The [Print History] button in the page footer allows the user to print a report listing all previously received samples with a matching batch number (See Figure 3 below). This report allows users to determine the type of testing required by looking at the testing which has been carried out previously on samples belonging to the same batch of product. The Product Sample History report shows the job name, sample IDs, test suite carried out for each sample, received date and completion date (if applicable).

Figure 3: Product Sample History Report

Adding a Job

A job template may be added to a study prior to receipt, using the [Add Job] function. The Add Job drop down menu at the footer of the page includes a list of job templates assigned to the product. Selecting a job template from the drop down and selecting the [Add Job] button will add a job and corresponding samples to the grid ready for receipt. The job and sample IDs will initially appear as ‘Unsaved’ until the jobs and samples are either saved or received, at which point they will be assigned a unique sequential ID (See Figure 4 below). A test suite for the samples within the job cannot be selected until the job is saved.

Figure 4: Receive Samples – Add Job

To save the newly added job, select [Save]. After saving, the grid will be updated to display the IDs of the job and its samples. It is also possible to receive a new job without saving the details first: selecting the new job and then [Receive] saves the job details, assigns job and sample IDs, and removes the items from the Receive Samples page.

Adding a Sample

A sample template may be added to a job prior to receipt, using the [+] button under the job details. Selecting the [+] button will open the Add Sample window (See Figure 5 below), where the user can select a sample template along with the number of samples to add. Selecting [OK] will add the required number of samples to the grid ready for receipt. The sample ID(s) will initially appear as ‘Unsaved’ until the samples are either saved or received, at which point they will be assigned a unique sequential ID.

Figure 5: Receive Samples – Add Sample

To save the newly added sample(s), select [Save]. After saving, the grid will be updated to display the ID of each sample. It is also possible to receive samples without saving the sample details first: selecting the new sample(s) and then [Receive] saves the sample details, assigns sample IDs, and removes the items from the Receive Samples page.

Removing Jobs/ Samples

The jobs and samples generated for each study come from the job template configured on the product. In some cases, not all jobs and samples will be required for testing and it may be necessary to remove a job or sample from the study prior to receipt. To remove a job or sample, select the [x] button to the left of the job or sample ID.

The job/sample will be removed from the grid, but in order to confirm this action the user must either select the [Save] or [Receive] button. Upon saving or receiving, the user will be required to enter a reason for deletion.

NB: It is not possible to delete the last remaining sample in a job or the last remaining job from a study. Where a job or sample is the last remaining item, the [x] button will not be available. Where a study or job is no longer required, a user may retire the study via the View Studies screen or retire the job at Retire Jobs.

Register Samples

The purpose of sample registration is to ensure that the details entered upon sample submission are correct and to assign tests to the samples. A user can register samples by selecting the Register Samples icon or selecting ‘Register Samples’ from the LIMS menu.

Figure 6: Register Samples

 

When the Register Samples page is displayed, a list of jobs awaiting registration is populated in the left pane, along with the study ID, batch number, product and client. When the user selects a job from the left pane, all corresponding samples for the job are displayed in the grid on the right (See Figure 6 above). The sample details include the following:

  • Sample ID
  • Sample Number
  • Product Name
  • Sample Batch Number
  • Test Suite
  • Workflow
  • Submitted Date
  • Date of last update

To register the samples, tick the box to the left of the desired sample ID, or use the Select All tick box to select all samples of the job and select [Register]. The individual sample registration screen will be displayed (See Figure 7 below) where the sample details (with the exception of the Product) may be amended. The test methods for the chosen test suite are displayed and these may be removed by un-ticking the box to the left of the test method name. Alternatively, test methods may be added by selecting the appropriate item from the drop down menu under the Test Methods box and selecting [Add]. After adding a new test method, the method will appear selected in the Test Methods box.

Figure 7: Register Sample – Sample Details

 

If a comment on the certificate of analysis is required (outputs may vary if using custom reports), the desired text can be entered in the Comments field, e.g. ‘Sample leaked during shipping’.

The contracted and scheduled completion dates are used to notify the client of the expected completion dates for reporting and for the generation of laboratory metrics. The default dates are defined by the contracted turnaround at Product setup and may be amended during the sample registration process.

Sample labels may also be re-printed by selecting the number of labels to print and selecting the [Print Sample Labels] button. A reason for reprinting labels will be required.
When all details for the sample are correct, select [Register] and the details for the next sample will be displayed. If the study workflow is configured with alerts, a prompt will be displayed to check if notification is required. Selecting [Yes] will trigger a user alert, or in the case of an email alert the recipients of the email can be specified and the subject/message body may be amended prior to sending. Alternatively, if a notification is not required the user can select [No] on the alert prompt.

Create Work

Workbooks can be created to capture analytical raw data during sample analysis. This may include equipment used during analysis and any data required for calculation of the final result. Workbooks can consist of one or more sample tests across multiple studies. A user can create a workbook by selecting the Create Work icon or selecting ‘Create Work’ from the LIMS menu.

Figure 8: Create Work

 

On initial entry to the Create Work page, the left pane is populated with a list of LIMS users and the current user is auto-selected from the list (See Figure 8 above). The data that appears in the tabbed pages on the right is filtered based on the groups associated with the user selected in the left hand pane. Therefore, work cannot be assigned to an analyst where their group is not associated with a Test Method. There are four tabs which can be used to search for sample tests to add to a workbook:

  • Study
  • Job
  • Sample
  • Test Method

Each tab contains a Search field, where the user may search on any of the columns displayed in the table. The Study tab displays a list of studies which have been registered in LIMS and are currently awaiting testing. Study details include:

  • Study ID
  • Study Description
  • Study Batch Number
  • Product Name
  • Product Description
  • Manufactured Date
  • Contracted Completion Date

Selecting a study from the table then populates the Selected Test Methods box on the right with the associated sample tests. It is possible to add more than one study to a single workbook. Where more than one study is selected for a workbook, the sample tests for each study are added to the list of selected test methods on the right.

The Job tab displays a list of jobs which have been registered in LIMS and are currently awaiting testing. Job details include:

  • Job ID
  • Job Name
  • Job Description
  • Study Batch Number
  • Product Name
  • Product Description
  • Manufactured Date
  • Contracted Completion Date

Selecting a job from the table then populates the Selected Test Methods box on the right with the associated sample tests. It is possible to add more than one job to a single workbook. Where more than one job is selected for a workbook, the sample tests for each job are added to the list of selected test methods on the right.

The Sample tab displays a list of samples which have been registered in LIMS and are currently awaiting testing. Sample details include:

  • Sample ID
  • Sample Name
  • Job ID
  • Sample Batch Number
  • Product Name
  • Product Description
  • Manufactured Date
  • Contracted Completion Date

Selecting a sample from the table then populates the Selected Test Methods box on the right with the associated sample tests. It is possible to add more than one sample to a single workbook. Where more than one sample is selected for a workbook, the sample tests for each sample are added to the list of selected test methods on the right.

The Test Method tab displays a list of test methods for samples which have been registered in LIMS and are currently awaiting testing. Test Method details include:

  • Test Method Name
  • Test Method Description
  • Sample ID
  • Sample Batch Number
  • Product Name
  • Product Description
  • Manufactured Date
  • Contracted Completion Date

Selecting a test method from the table then populates the Selected Test Methods box on the right with the associated sample test. It is possible to add more than one test method to a single workbook. Where more than one test method is selected for a workbook, the sample tests for each test method are added to the list of selected test methods on the right.

NB: Moving between search tabs will refresh the Selected Test Methods, and items cannot be added to a workbook via more than one tab at a time.

The Selected Test Methods table displays the following sample test information (See Figure 9 below):

  • Sample ID
  • Test ID
  • Test Method
  • Trace item

Figure 9: Create Work – Trace Item

 

If a sample test selected is no longer required, this may be removed from the workbook by deselecting the tick box from Selected Test Methods. Where this sample test has been added using the Study, Job or Sample tabs, the whole item will remain selected, but the single sample test deselected will not be added to the workbook. The study, job or sample will remain on the Create Work page until all sample tests have been assigned to a workbook.

The Trace column can be used to mark a sample test within the workbook as the trace item. If using LIMS alongside either the Stock or Equipment modules, the trace item will hold details of stock and equipment items used during testing and these can be recorded at Enter Results. Where stock and equipment are recorded against the trace sample test, the remaining sample tests for the selected test method in the same workbook will be populated with the stock and equipment data also.

Once all workbook tests are added, select the [Create] button to create the workbook. Following creation of the workbook, a prompt will be displayed which contains the workbook ID. The [Print Workbook] button in the page footer will also become active to allow the user to print the report. The button displays the workbook ID of the workbook just created. Selecting the [Print Workbook] button will open a printer selection window, where the user can select a printer to print to. Once the [Print] button is selected, the workbook will be sent directly to the printer.

Edit Work

Workbooks may be edited following creation via the Edit Work page (See Figure 10 below). The editing of workbooks can include amending the user originally assigned to, adding new tests or removing existing tests. A user can edit a workbook by selecting ‘Edit Work’ from the LIMS menu.

Figure 10: Edit Work

 

On initial entry to the Edit Work page, the left pane is populated with a list of existing workbooks which contain items currently in test. NB: A workbook may not be edited after all sample tests have had results entered.
The Workbook details in the left pane include the Workbook ID assigned after the workbook is created, and the user that the workbook is currently assigned to.
When a workbook is selected from the left pane, the sample tests currently assigned to this workbook are listed on the right. The sample tests are displayed in the order that they were added to the workbook at Create Work. Details of the existing sample tests include Sample ID, Test ID, Test Method Name and Description, Sample Batch Number, Product Name and Description, Manufactured Date of the product and Contracted Completion Date of each sample.

Re-assigning Work

In cases where work has been assigned to the wrong analyst, or work is due to be completed by another user, workbooks may be reassigned to different users at Edit Work. To reassign a workbook to another user, first locate and select the workbook from the left pane. The Assigned To field in the header will display the name of the current user assigned to the workbook. To change the user, select the drop down menu and select the name of the user to re-assign the workbook to. Once the correct user is selected, select the [Save] button to save the workbook. A reason for change will be required on saving the workbook. The user making the change will also be prompted to confirm whether to continue, as the new user may not be assigned to the groups required to view the sample tests in the workbook. If the user selects [No], changes to the user will not be saved; if the user confirms it is OK to continue, the changes to the workbook are saved.

NB: If the user assigned to a workbook is amended, the workbook will no longer appear under the original user workbooks at Enter Results. Once changed, the workbook will only be available under the user it is currently assigned to when entering results.

Adding Tests to a Workbook

To add a new test to an existing workbook, first locate and select the workbook from the left pane. Select the [+] button at the bottom of the list of tests and the Edit Workbook Test Methods window will be displayed (See Figure 11 below).

Figure 11: Edit Work – Adding Tests

 

Similar to the Create Work page, there are four tabs which can be used to search for sample tests to add to the existing workbook:

  • Study
  • Job
  • Sample
  • Test Method

Each tab displays a list of items which are registered in LIMS and are currently awaiting testing. Selecting an item from the table populates the Selected Test Methods box on the right with the associated sample tests. The sample tests already assigned to the workbook are also displayed at the top of the Selected Test Methods box, so that the user can see the original tests and any newly added tests.

The Trace item can also be amended when adding new tests. Simply tick the check box of the desired trace item, and the original trace item will automatically be deselected. NB: It is only possible to assign one trace item in a single workbook.
To save the workbook with the newly added tests, select the [Save] button in the bottom right of the window. A reason for change will be required on saving any changes. The user amending the workbook will also be prompted to re-print a copy of the workbook upon saving. Once saved, the new test will be displayed in the list of workbook tests. Alternatively, if a user no longer wishes to make amendments, added tests can be cancelled by selecting the [Cancel] button and the workbook tests will not be affected.

Removing Tests

To remove a test from a workbook, select the [x] button to the left of the Sample ID of the sample test and the test will be removed from the workbook list. Multiple tests can be removed at a time by selecting the [x] button next to all sample tests, and then saving the workbook. To save the workbook changes, select the [Save] button in the bottom right of the window. A reason for change will be required on saving changes.

Printing Workbooks

The Edit Work page can also be used for printing workbooks. To print a workbook, select the existing workbook from the left pane and select the [Print Workbook] button in the page footer. Where a workbook has been printed before (the initial print on creating the workbook counts as the first print), the user will be required to enter a reason for reprint. The date and time of each print will be stored in the workbook audit trail (see Workbook Audit), along with the user carrying out the print, the printer sent to and the reason for reprint. A printer selection window will then be displayed, where the user can select a printer to print to. Once the [Print] button is selected, the workbook will be sent directly to the printer.

Workbook Audit

Where changes are made to a workbook, such as adding or removing tests, or assigning to a different user, changes will be recorded in the Workbook Audit report. This report can be accessed via the [Change Log] button at Edit Work, or alternatively via the Reports menu (Reports > Quality > Change Log > Summary). The Workbook Audit displays the date and time of the change, the user who carried out the change and reason (See Figure 12 below).

 

Figure 12: Workbook Audit

The printing of workbooks is captured in the Workbook Printed change log, which can be accessed via the Reports menu (Reports > Quality > Change Log). The Change Log window allows the user to select a user, if necessary, an event type and date range to view changes within. To view the Workbook Printed change log, select ‘Workbook printed’ from the Event Type menu, and select a user and date range. When generated, the report displays the date and time of print, the user who carried out the action, printer name and reason for reprint (if applicable).

Enter Results

Sample data can be entered by selecting the Enter Results icon or selecting ‘Enter Results’ from the LIMS menu.

On initial entry to the Enter Results page, there are two tabs which can be used to locate and select sample tests for results entry (See Figure 13 below). The User tab is automatically selected, displaying a list of users with workbooks assigned which require results entry. Selecting a user from the left pane expands the workbook list under the user’s name. All Workbook IDs assigned to the selected user which contain sample tests awaiting results entry will be displayed below the user’s name. Selecting a Workbook ID from the left pane then expands the test method list. All test methods assigned to the workbook which have sample tests awaiting results entry will be displayed beneath the Workbook ID.

Figure 13: Enter Results by User

 

Alternatively, results can be entered by test method for all workbooks using the Test Method tab (See Figure 14 below). The left pane displays a list of test methods which contain sample tests awaiting results entry. A sample test will be assigned a test method based on the version number of the test method at the point of sample registration. For this reason, each test method is displayed with its corresponding version number in the left pane.

Figure 14: Enter Results by Test Method

 

Selecting a test method from the left pane populates the results grid on the right with associated sample tests. Where a sample holds multiple test methods, each test is assigned a unique Test ID. In this case, some sample tests will hold the same Sample ID, but a different Test ID. The results grid can display the following test details:

  • Sample ID
  • Test ID
  • Sample Batch Number
  • Product Name & Description
  • Test Date
  • Test Method Additional Fields
  • Stock (if applicable)
  • Equipment (if applicable)
  • Inputs
  • Outputs

Additional sample data can also be seen when placing the mouse cursor over the Sample ID in the results grid. The tool tip on the Sample ID will display the Study ID and Job ID that the sample belongs to.

For test methods with a larger number of inputs and outputs, the scroll bar can be used to view all columns. The sample data may be filtered using any of the columns, including inputs and outputs, by selecting the filter icon to the right of the column header name.

Data can be entered directly into the cell, or pasted from an external source using the standard Microsoft shortcut keys (Ctrl + V). Data from a single cell may be copied into multiple cells in one action by highlighting the single cell and selecting [Copy] using the right hand mouse button or ‘Ctrl + C’. To paste the entry to multiple cells, highlight all of the cells into which data is to be inserted by clicking and dragging and select ‘Ctrl + V’. Where a single cell is to be copied down a column, the ‘Copy Down’ function can be used. By right clicking on the cell to be copied and selecting [Copy Down], the user receives the following options:

  • Copy Down to Next Row – selected cell is copied down to one cell below.
  • Copy Down to All Rows – selected cell is copied down to all rows in the grid.
  • Custom – selected cell is copied to user defined rows in the grid.

Selecting the ‘Custom’ function opens a Copy To window which displays a list of Test IDs available in the results grid. The window also displays the current value for each Test ID, so that the user is aware if there is existing data in a cell. All selected Test IDs will be marked as rows to copy to. Once all required Test IDs are selected, select [OK] and the results grid will automatically be updated with the copied values.
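As an illustration of the copy-down behaviour described above, the sketch below models the results grid as a simple dictionary and implements the three options. It is purely illustrative; the structures and names are hypothetical and not part of LabHQ.

    # Illustrative sketch of the Copy Down options described above.
    # The grid is modelled as {test_id: {column: value}}; all names are hypothetical.
    def copy_down(grid, column, source_test_id, mode="next", target_test_ids=None):
        test_ids = list(grid)                     # rows in display order
        value = grid[source_test_id][column]      # value being copied
        src_index = test_ids.index(source_test_id)

        if mode == "next":                        # Copy Down to Next Row
            targets = test_ids[src_index + 1:src_index + 2]
        elif mode == "all":                       # Copy Down to All Rows
            targets = [t for t in test_ids if t != source_test_id]
        elif mode == "custom":                    # Custom: user-defined rows
            targets = target_test_ids or []
        else:
            raise ValueError(f"unknown mode: {mode}")

        for test_id in targets:
            grid[test_id][column] = value
        return grid

    # Example: copy a weight reading down to all rows.
    grid = {
        "T-101": {"Weight (g)": 0.502},
        "T-102": {"Weight (g)": None},
        "T-103": {"Weight (g)": None},
    }
    copy_down(grid, "Weight (g)", "T-101", mode="all")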

Outputs are automatically calculated and displayed once all required inputs have been entered. Any values that are calculated outside of the upper or lower limit are displayed in red, and any values outside of the upper or lower limit warning are displayed in blue. Alternatively, if a specification holds a target value, any values that do not match the target will be displayed in red. If a specification, target and limits are available for the product, then the details may be displayed by placing the mouse cursor over the relevant output.

Examples of output result display can be seen in the table below.

Output Type: String

Test Specification Component | Result       | Display              | Example
Target                       | Equal To     | In Specification     | Target: Complies; Result: Complies
Target                       | Not Equal To | Out of Specification | Target: Complies; Result: Does Not Comply

Output Type: Numeric

Test Specification Component | Result       | Display              | Example
Target                       | Equal To     | In Specification     | Target: 10; Result: 10
Target                       | Not Equal To | Out of Specification | Target: 10; Result: 11 (or Result: 9)
Lower Limit                  | Equal To     | In Specification     | Lower Limit: 10; Result: 10
Lower Limit                  | Greater Than | In Specification     | Lower Limit: 10; Result: 15
Lower Limit                  | Less Than    | Out of Specification | Lower Limit: 10; Result: 5
Upper Limit                  | Equal To     | In Specification     | Upper Limit: 10; Result: 10
Upper Limit                  | Greater Than | Out of Specification | Upper Limit: 10; Result: 15
Upper Limit                  | Less Than    | In Specification     | Upper Limit: 10; Result: 5
Lower Warning                | Equal To     | Out of Trend         | Lower Warning Limit: 10; Result: 10
Lower Warning                | Greater Than | In Specification     | Lower Warning Limit: 10; Result: 15
Lower Warning                | Less Than    | Out of Trend         | Lower Warning Limit: 10; Result: 5
Upper Warning                | Equal To     | Out of Trend         | Upper Warning Limit: 10; Result: 10
Upper Warning                | Greater Than | Out of Trend         | Upper Warning Limit: 10; Result: 15
Upper Warning                | Less Than    | In Specification     | Upper Warning Limit: 10; Result: 5
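The table above reduces to a simple classification rule: a breached target or upper/lower limit reports ‘Out of Specification’, while a result at or beyond a warning limit (but within the limits) reports ‘Out of Trend’. The sketch below illustrates that logic for a numeric output; the Specification structure and field names are assumptions for the example, not LabHQ’s internal API.

    # Illustrative classification of a numeric output, following the table above.
    # The Specification structure below is hypothetical.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Specification:
        target: Optional[float] = None
        lower_limit: Optional[float] = None
        upper_limit: Optional[float] = None
        lower_warning: Optional[float] = None
        upper_warning: Optional[float] = None

    def classify(result: float, spec: Specification) -> str:
        # Target and limit breaches report "Out of Specification".
        if spec.target is not None and result != spec.target:
            return "Out of Specification"
        if spec.lower_limit is not None and result < spec.lower_limit:
            return "Out of Specification"
        if spec.upper_limit is not None and result > spec.upper_limit:
            return "Out of Specification"
        # Per the table, a result equal to a warning limit is treated as Out of Trend.
        if spec.lower_warning is not None and result <= spec.lower_warning:
            return "Out of Trend"
        if spec.upper_warning is not None and result >= spec.upper_warning:
            return "Out of Trend"
        return "In Specification"

    # Example: upper limit 10, upper warning 8 -> a result of 9 is Out of Trend.
    print(classify(9, Specification(upper_limit=10, upper_warning=8)))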

 

Where there is more than one specification assigned to a product, a result will be compared against all specifications:

  • If a result is within limits for all specifications, the output will be displayed as ‘in specification’.
  • If the result is within limits for only one specification, the output will be displayed as ‘out of specification’, and the tool tip will highlight the specification that the result does not meet.
  • If the result is not within limits for any of the specifications, the output will be displayed as ‘out of specification’, and the tool tip will highlight the specifications that the result does not meet.

The precision rounding determines the number of significant figures to be taken into account when comparing the result against the specification. Where precision rounding is configured for a test specification component, the result is rounded to the nearest decimal configured using the ‘round half away from zero’ method.
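For example, with a precision of one decimal place, 10.05 rounds to 10.1 and -10.05 rounds to -10.1 (the tie is rounded away from zero), whereas 10.04 rounds to 10.0. A minimal sketch of this rounding method, assuming the configured precision is expressed as a number of decimal places:

    # Round half away from zero, e.g. before comparing a result against its specification.
    # Decimal places stand in here for the configured precision.
    from decimal import Decimal, ROUND_HALF_UP

    def round_half_away_from_zero(value: str, decimal_places: int) -> Decimal:
        exponent = Decimal(1).scaleb(-decimal_places)   # e.g. Decimal('0.1') for 1 d.p.
        return Decimal(value).quantize(exponent, rounding=ROUND_HALF_UP)

    print(round_half_away_from_zero("10.05", 1))    # 10.1
    print(round_half_away_from_zero("-10.05", 1))   # -10.1
    print(round_half_away_from_zero("10.04", 1))    # 10.0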

Once all sample data has been entered, a sample test will receive a tick in the first column to the left of the Sample ID. This denotes that the sample is ready for submission. Where no sample tests are ready for submission, the [Submit] button will remain inactive. NB: All inputs for an individual sample test, along with any mandatory additional fields, must be completed in order to allow submission of the sample test.

When submitting results, the user will be prompted to enter their password (See Figure 15 below). Once submitted, the result data is stored and the data logged against the user, date and time.

Figure 15: Electronic Signature

 

If an error in data entry is noted following results submission, data may be edited prior to validation via the Edit Results screen. For more information, see Edit Results.

Investigations

Starting an Investigation

Where a sample test holds an OOS result, or further investigations are required, an investigation may be raised and flagged against a sample test. To start an investigation on a sample test, right-click anywhere on the row of the desired Test ID and select ‘Investigation > Start Investigation’ from the menu. The Investigation window will be displayed (See Figure 16 below), showing the details of the selected sample, including Sample ID. On starting an investigation, the user will be required to enter a reason in the Reason field. By selecting [Start], the user confirms the raising of the investigation and a prompt will be displayed with the Investigation ID created.

Figure 16: Enter Results – Starting Investigations

Adding Tests to a Sample

Where a repeat or additional test is required for a sample, a new test can be added via the Enter Results page. To add a new test to a sample, right-click on the row of the sample which requires an additional test and select Add Tests from the menu. The Add Test Methods window will be displayed, from which the user can select from a list of available test methods to add to the sample (See Figure 17 below).

Figure 17: Enter Results – Adding Tests

 

The Search field at the top of the window allows the user to type search the list using test method name. One or more test methods can be selected from the list, and where more than one test method is selected, one test will be added to the sample per test method.

Once all required test methods have been selected, select [OK] to add the tests to the sample. The user will be required to enter a reason for change after adding new tests. The newly added tests for the sample will receive a status of ‘Registered’ in LIMS, which means that they will need to be added to a workbook via Create Work or Edit Work before results can be entered.
NB: Added tests are not automatically added to the workbook selected at Enter Results. If a user wishes to add a new test to their current workbook, this can be added via Edit Work (see Edit Work – Adding Tests).
An investigation can also be started at the point of adding tests by ticking the Start Investigation prompt at the bottom of the window. Where the investigation tick box is selected, the user will be prompted to enter a reason for starting the investigation and the Investigation ID will be confirmed in a pop-up.

Trace Samples

Sample tests marked as trace items when creating a workbook can be visibly identified at Enter Results by the trace icon in the third column to the left of the Sample ID (See Figure 18 below). Where LIMS is used alongside the Stock or Equipment modules, or both, the trace sample can be used to populate stock and equipment data for all sample tests of the same test method within the same workbook. Stock or equipment assigned to the trace sample will automatically propagate to other sample tests in the results grid, however stock or equipment assigned to other non-trace samples in the grid will not affect other sample tests. It is also possible to amend stock and equipment assigned to other non-trace samples, either by adding or removing items, and this should not affect the stock and equipment assigned to the trace sample.

Figure 18: Enter Results – Trace Sample
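As a rough sketch of the propagation rule described above (the data structures are hypothetical, not LabHQ’s API): stock and equipment recorded against the trace sample test are copied to the other sample tests of the same test method in the workbook, while entries recorded against non-trace sample tests stay local to that test.

    # Illustrative sketch of trace-item propagation within a workbook.
    def propagate_trace_data(workbook_tests):
        trace_tests = [t for t in workbook_tests if t.get("is_trace")]
        for trace in trace_tests:                  # at most one trace item per workbook
            for test in workbook_tests:
                if test is trace or test["test_method"] != trace["test_method"]:
                    continue
                # Fill from the trace item only; data entered directly against a
                # non-trace sample test is left as it is.
                test.setdefault("stock", list(trace.get("stock", [])))
                test.setdefault("equipment", list(trace.get("equipment", [])))
        return workbook_tests

    workbook = [
        {"test_id": "T-1", "test_method": "Assay", "is_trace": True,
         "stock": ["Reagent lot 123"], "equipment": ["Balance B-01"]},
        {"test_id": "T-2", "test_method": "Assay", "is_trace": False},
    ]
    propagate_trace_data(workbook)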

Edit Results

It is possible to modify existing sample data via the Edit Results screen. This page can be accessed by selecting ‘Edit Results’ from the LIMS menu.

On initial entry to the Edit Results page, there are two tabs which can be used to locate and select sample tests which have already had results entered (See Figure 19 below). The User tab is automatically selected, displaying a list of users with workbooks assigned which have results available for modification. Selecting a user from the left pane expands the workbook list under the user’s name. All Workbook IDs assigned to the selected user which contain sample tests with results entered will be displayed below the user’s name. Selecting a Workbook ID from the left pane then expands the test method list. All test methods assigned to the workbook which have sample tests with results entered will be displayed beneath the Workbook ID.

Figure 19: Edit Results

 

Alternatively, results can be edited for test methods across multiple workbooks using the Test Method tab. The left pane displays a list of test methods which contain sample tests with results entered. A sample test will be assigned a test method based on the version number of the test method at the point of sample registration. For this reason, each test method is displayed with its corresponding version number in the left pane.

Selecting a test method from the left pane populates the results grid on the right with associated sample tests. The results grid can display the following test details:

  • Sample ID
  • Test ID
  • Sample Batch Number
  • Product Name & Description
  • Test Date
  • Test Method Additional Fields
  • Stock (if applicable)
  • Equipment (if applicable)
  • Inputs
  • Outputs

Additional sample data can also be seen when placing the mouse cursor over the Sample ID in the results grid. The tool tip on the Sample ID will display the Study ID and Job ID that the sample belongs to.

Editable data fields include inputs, test date, stock (if applicable), equipment (if applicable) and any test method additional fields. To edit any one of these fields, select the cell and type in the new value. Where an input field is amended, all outputs calculated using this input will be automatically recalculated and the new value compared against the associated specification(s).
If any changes are made to the existing sample data, the [Edit Result] button in the bottom right of the page will become active. Selecting [Edit Result] will prompt the user to verify the change by entering their password as an electronic signature. The user will also be required to enter a reason for change, and any changes to sample data will be recorded in the Sample Audit, along with the user who carried out the change, date, time and reason. Once the results are saved, the Edit Results screen will automatically refresh and any amendments will be displayed in the results grid.

Investigations

Sample investigations can also be started, viewed or stopped via Edit Results. To start an investigation on a sample test at Edit Results, right-click anywhere on the row of the desired Test ID and select ‘Investigation > Start Investigation’ from the menu. The Investigation window will be displayed, showing the details of the selected sample, including Sample ID. On starting an investigation, the user will be required to enter a reason in the Reason field. By selecting [Start], the user confirms the raising of the investigation and a prompt will be displayed with the Investigation ID created.

Once an investigation is started on a sample test, an investigation icon will be displayed in the second column to the left of the Sample ID. The blue triangle denotes that the sample test is currently under investigation. Investigation details can be viewed by double-clicking on the investigation icon, or by right-clicking on the sample test and selecting ‘Investigation > View Investigation’. The Investigation window is then displayed with a reason and resolution (if applicable).
To stop an investigation at Edit Results, right-click anywhere on the row of the test holding the investigation and select ‘Investigation > Stop Investigation’ from the menu. The Investigation window will be displayed, showing the details of the selected sample, along with the initial reason for starting the investigation. When stopping an investigation, the user will be required to enter a resolution in the Resolution field. Selecting [Stop] confirms closure of the investigation and a prompt will be displayed stating that the Investigation ID has been stopped.

Adding Tests

To add a new test to a sample via Edit Results, right-click on the row of the sample which requires an additional test and select Add Tests from the menu. The Add Test Methods window will be displayed, from which the user can select from a list of available test methods to add to the sample.

Once all required test methods have been selected, select [OK] to add the tests to the sample. The user will be required to enter a reason for change after adding new tests. The newly added tests for the sample will receive a status of ‘Registered’ in LIMS, which means that they will need to be added to a workbook via Create Work or Edit Work before results can be entered.

NB: Added tests are not automatically added to the workbook selected at Enter Results. If a user wishes to add a new test to their current workbook, this can be added via Edit Work (see Edit Work – Adding Tests).

An investigation can also be started at the point of adding tests by ticking the Start Investigation prompt at the bottom of the window. Where the investigation tick box is selected, the user will be prompted to enter a reason for starting the investigation and the Investigation ID will be confirmed in a pop-up.

Validate Results

The specific process of data validation will depend on internal policies, however the validation process within LabHQ is designed to present the validator with the same view as the user who entered the results in order to verify that the data has been entered/transcribed correctly. Sample data can be validated by selecting the Validate Results icon or selecting ‘Validate Results’ from the LIMS menu (See Figure 20 below).

Figure 20: Validate Results

 

On initial entry to the Validate Results page, there are two tabs which can be used to locate and select sample tests for result validation. The User tab is automatically selected, displaying a list of users with workbooks assigned which require results validation. Selecting a user from the left pane expands the workbook list under the user’s name. All Workbook IDs assigned to the selected user which contain sample tests awaiting results validation will be displayed below the user’s name. Selecting a Workbook ID from the left pane then expands the test method list. All test methods assigned to the workbook which have sample tests awaiting results validation will be displayed beneath the Workbook ID.

Alternatively, results can be validated by test method for all workbooks using the Test Method tab. The left pane displays a list of test methods which contain sample tests awaiting results validation. A sample test will be assigned a test method based on the version number of the test method at the point of sample registration. For this reason, each test method is displayed with its corresponding version number in the left pane.

Selecting a test method from the left pane populates the results grid on the right with associated sample tests. The results grid can display the following test details:

  • Sample ID
  • Test ID
  • Sample Batch Number
  • Product Name & Description
  • Test Date
  • Conclusion
  • Test Method Additional Fields
  • Stock (if applicable)
  • Equipment (if applicable)
  • Inputs
  • Outputs

Additional sample data can also be seen when placing the mouse cursor over the Sample ID in the results grid. The tool tip on the Sample ID will display the Study ID and Job ID that the sample belongs to, along with the name of the user who entered the results.

The result data, including both inputs and outputs, are displayed in the grid as read-only. Each sample test is also displayed with a conclusion, which is applied based on the calculated result. The Conclusion values include:

  • Complies – Valid
  • Complies – Not Valid
  • Does Not Comply – Valid
  • Does Not Comply – Not Valid

Where all outputs for a sample test are calculated as within specification, the conclusion assigned to the sample test will be ‘Complies – Valid’. Where one or more outputs for a sample test are calculated as outside of specification, the conclusion assigned to the sample test will be ‘Does Not Comply – Valid’. Where a sample test or result of a sample test is not valid, the Conclusion field can be amended to ‘Complies – Not Valid’ or ‘Does Not Comply – Not Valid’. If a Conclusion requires amending, the user will be required to enter a reason for change on validating the results. Any changes to the conclusion will also be logged in the Sample Audit.

Investigations can be started, viewed and stopped in the same manner as Enter Results. Similarly, the user can also add tests in this page.
Results validation must be carried out independently of the user who entered the results. If a user selects their own username and workbook from the left pane, the sample tests will be available to view in the results grid, however the user will be prevented from validating the sample tests as the tick box will not be available to select the tests for validation. To select a sample test for validation, select the tick box to the left of the Sample ID and select [Validate]. The user will be requested to enter their password as an electronic signature and the action will be recorded in the sample audit trail.

Review Results

Review Results is an additional step in which data is checked by an independent user; it can be configured into the workflow. To access Review Results, select Lims > Results > Review Results or select the Review Results icon in the right hand corner. On entry to the Review Results page, data appears in the same manner as Validate Results. All data displayed is read-only, except that the user can amend the Conclusion. Where this value is changed, the user will be requested to enter their electronic signature and reason for change upon saving.

Approve Samples

Sample approval is the stage in which results are made available to the client. The specific process for approval will depend upon internal policies, however the approval process within LabHQ is designed to allow the approver to preview the final report and produce a statistical summary of the dataset available for approval. Sample data can be approved by selecting the Approve Samples icon or selecting ‘Approve Samples’ from the LIMS menu.

The left pane displays a list of studies which contain samples currently awaiting approval. Not all samples in a study must be validated for the study to appear as ready for approval: where a study contains at least one sample which is awaiting approval, the study will appear in the left pane at Approve Samples. The study list displays the Study ID, Batch Number and associated Product for each study. Selecting a study from the left pane populates the right hand grid with a list of jobs and samples associated with the selected study.

Figure 21: Approve Samples

Test results are displayed in a grid format to enable visual trending of the results obtained. Placing the mouse cursor over one of the outputs will display the specification tool tip, which allows the user to see the specification applied to the output and any associated target or limit values (See Figure 21 above). The specification will also be flagged in red where a result is outside of specification, or blue where a result sits outside of warning limits. The results grid also displays any test method additional field entries.

Samples can be selected by ticking the check box next to the Sample ID. Where one of the samples in a job is not yet ready for approval, the tick box will be hidden so that the user cannot approve the sample before it has been validated. When a sample is selected, the following options are available:

  • Trend

The Trend button outputs a statistical summary for the selected samples and test method outputs to assist in detecting outliers. Where there are multiple versions of a test method, the test method outputs will be displayed along with a version number. Where test method data has changed significantly between versions, it is recommended that trending is carried out separately per test method version. NB: The trend report is only applicable for numeric values.
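The precise contents of the trend report are not described here; purely as an illustration, a per-output summary of the kind that helps flag outliers might look like the sketch below (the outlier rule shown is an assumption for the example, not LabHQ’s).

    # Illustrative per-output summary of the kind a trend report might contain.
    # Only numeric results are considered, matching the NB above.
    from statistics import mean, stdev

    def trend_summary(results):
        summary = {"n": len(results), "mean": mean(results),
                   "min": min(results), "max": max(results)}
        if len(results) > 1:
            sd = stdev(results)
            summary["stdev"] = sd
            # Flag values more than 2 standard deviations from the mean
            # as potential outliers (an example rule only).
            summary["potential_outliers"] = [
                r for r in results if abs(r - summary["mean"]) > 2 * sd
            ]
        return summary

    print(trend_summary([10.0, 10.1, 9.9, 10.0, 10.1, 9.9, 15.0]))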

 

Additionally, two buttons next to the Sample ID allow for better quality control. The following buttons are available:

 

  • View Sample Certificate

A preview of the sample certificate is generated so the user can see all data associated with the selected sample before approving it.

 

  • View History

The View History button generates an audit log which displays a list of changes made to the selected sample during the sample lifecycle so that the approver can review the changes before approving (See Figure 22 below).

Figure 22: Sample Audit Trail

 

The Conclusion of each sample is displayed to the right of the Sample Name. The conclusion is calculated based on the worst case conclusion of the sample tests belonging to the sample, therefore:

  • Where all sample tests received a status of ‘Complies – Valid’ following validation, the status of the sample will be assigned as ‘Complies – Valid’ at Approve Samples.
  • Where one of the sample tests holds a status of ‘Complies – Not Valid’, the status of the sample will be assigned as ‘Complies – Not Valid’ at Approve Samples.
  • Where one of the sample tests holds a status of ‘Does Not Comply – Valid’, the status of the sample will be assigned as ‘Does Not Comply – Valid’ at Approve Samples.
  • Where one of the sample tests holds a status of ‘Does Not Comply – Not Valid’, the status of the sample will be assigned as ‘Does Not Comply – Not Valid’ at Approve Samples.

The table below shows some examples of how sample conclusions are assigned in LabHQ.

Sample Test Conclusion – Test 1 | Sample Test Conclusion – Test 2 | Sample Test Conclusion – Test 3 | Sample Conclusion
Complies – Valid                | Complies – Not Valid            | Complies – Valid                | Complies – Not Valid
Complies – Valid                | Does Not Comply – Valid         | Complies – Not Valid            | Does Not Comply – Valid
Does Not Comply – Valid         | Does Not Comply – Not Valid     | Complies – Valid                | Does Not Comply – Not Valid
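Consistent with the bullets and the examples above, the sample conclusion can be thought of as the worst case of its test conclusions, where ‘Does Not Comply’ outranks ‘Complies’ and, within each, ‘Not Valid’ outranks ‘Valid’ (this severity order is inferred from the examples in the table). A minimal sketch:

    # Worst-case aggregation of sample test conclusions into a sample conclusion,
    # consistent with the examples above (severity order inferred from the table).
    SEVERITY = [
        "Complies – Valid",             # least severe
        "Complies – Not Valid",
        "Does Not Comply – Valid",
        "Does Not Comply – Not Valid",  # most severe
    ]

    def sample_conclusion(test_conclusions):
        return max(test_conclusions, key=SEVERITY.index)

    print(sample_conclusion(["Complies – Valid",
                             "Does Not Comply – Valid",
                             "Complies – Not Valid"]))
    # -> Does Not Comply – Valid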

 

If required, these conclusions can be changed to any of the other options. If a conclusion is changed, the save button will become active and the user will be prompted for a Reason for Change either upon saving or upon approval.

When one or more samples are selected, the [Approve] button will become active. Selecting [Approve] approves all sample data and the study is removed from the left hand pane. Upon approval, the user is required to enter their password as an electronic signature to confirm the action.

Unapprove Samples

Samples can be unapproved through the Unapprove Samples dialogue, which is accessed through Lims > Samples > Unapprove Samples. When opened, the Unapprove Samples dialogue will be displayed and allow the user to enter a sample ID. If the sample entered has not been approved, the user will receive a warning. If the sample entered has been approved, the user will receive confirmation and the sample will be unapproved.

The study/sample will then be available at Edit Results, so the required changes can be made, and at Approve Samples, so it can be re-approved.

Figure 23: Unapprove Samples

Unvalidate Results

Results can be unvalidated from the Unvalidate Results page which is accessible from the Lims menu. Results will appear in this page where they have been unreviewed. To Unvalidate a result, select the tick box on the sample row and select [Unvalidate] (See Figure 24 below). A reason for change and the user’s password will be requested as an electronic signature to confirm. This action will be recorded in the sample audit trail.

Figure 24: Unvalidate Results

View Studies

An overview of studies is displayed on the View Studies page. This page can be accessed from the Lims menu. The following data is displayed for each study in the grid (See Figure 25 below):

  • Study ID
  • Study Status
  • Study Description
  • Batch Number
  • Product Name
  • Product Description
  • Manufactured Date
  • Person Responsible
  • Created By
  • Created Date

Figure 25: View Studies

 

The studies displayed are filtered by Client, and the client can be selected from the dropdown at the top of the table. The studies displayed are also filtered by date range. The default date range is one month previous, and the user can widen or shorten this window using the date pickers provided. For full flexibility, the user can also type search the studies using the search bar in the right hand corner.

The edit icon is available for studies that have not been approved. When the edit icon is selected, the Edit pane appears on the right hand side (See Figure 26 below). The following information is displayed and is editable, with the exception of Study ID and Product:

  • Study ID
  • Batch Number
  • Study Description
  • Product
  • Manufactured Date
  • Person Responsible
  • Notes

Figure 26: View Studies – Edit Study

Jobs within a study can also be viewed by selecting the tick box of one or more studies and selecting [View Jobs]. Where more than one study is selected, all jobs for the selected studies will appear.

Studies can be removed by selecting the tick box next to the Study ID. Only one study can be removed at a time.

View Jobs

An overview of jobs is displayed on the View Jobs page. This page can be set as a user’s dashboard and can also be accessed from the Lims menu. The following data is displayed for each job in the grid (See Figure 27 below):

  • Job ID
  • Job Status
  • Job Description
  • Study ID
  • Batch Number
  • Product Name
  • Product Description
  • Created Date
  • Contracted Completion Date
  • Actual Completion Date
  • Job additional fields as configured on the Client

Figure 27: View Jobs

 

The jobs displayed are filtered by Client, and the client can be selected from the dropdown at the top of the table. The jobs displayed are also filtered by date range. The default date range is three months previous, and the user can widen or shorten this window using the date pickers provided. For full flexibility, the user can also type search the jobs using the search bar in the right hand corner.

The edit icon is available for jobs that have not been approved. When the edit icon is selected, the Edit pane appears on the right hand side (See Figure 28 below). The following information is displayed; Job Description is the only editable field:

  • Job ID
  • Job Description
  • Product
  • Job additional fields as configured on the Client
  • Job additional fields as configured on the Product

Samples within a job can also be viewed by selecting the tick box of one or more jobs and selecting [View Samples]. Where more than one job is selected, all samples for the selected jobs will appear.

Figure 28: View Jobs – Edit Job

View Samples

The View Samples page can be accessed from Lims > Samples > View Samples. Alternatively, the user can view samples by drilling down from the View Jobs page. The following data is displayed in the grid at View Samples:

  • Sample ID
  • Sample Status
  • Sample Name
  • Job ID
  • Study ID
  • Batch Number
  • Product Name
  • Product Description
  • Test Suite
  • Received Date
  • Completion Date
  • Sample additional fields as configured on the Client

The user can click on the [+] icon next to the tick box on a sample row. This will display the test methods assigned and test results where they have been submitted for the sample (See Figure 29 below). The test specification for each test method can be displayed when the mouse is hovered over the result.

Figure 29: View Samples

 

Editing Samples

Where a sample has not yet been approved, the user can edit the sample details. If the sample holds a status post-register, a prompt will appear to confirm the user is OK to proceed, as the sample is under analysis. When the edit icon is selected, a pane appears on the right hand side (See Figure 30 below). The edit pane displays the following information; the Sample ID, Job ID, Sample Name and Product Name are not editable (* denotes a field which appears in post-register status):

  • Sample ID
  • Job ID
  • Sample Name
  • Product Name
  • Batch Number
  • Comment
  • Test Methods*
  • Contracted Completion Date*
  • Scheduled Completion Date*
  • Number of Labels*
  • Sample additional fields as configured on the Client
  • Sample additional fields as configured on the Product

Figure 30: View Samples – Edit Sample

Where the user saves a change to the sample, a reason for change and their password as their electronic signature will be requested and recorded in the sample audit trail.
The user also has the ability to print a Quality Report, Trend Report, Summary Certificate and Sample Certificate.
The Quality Report and Summary Certificate display a comparison of test results across the selected samples. The Trend Report illustrates the test results in a line graph, table and histogram. The Sample Certificate displays the test results for one sample at a time in a professional format, compared against a selected specification.

Investigations

Investigations can be started, viewed and stopped for a sample. This functionality is accessible from the Enter Results, Edit Results, Validate Results and also the Investigations pages. To access the Investigations page, select Lims > Samples > Investigations. A list of studies will be displayed in the left hand pane and is searchable by Study ID, Batch Number, Product Name and Client Name. When a study is selected in the left hand pane, all test IDs associated with the samples and jobs within the study will be displayed (See Figure 31 below). The information displayed in the grid, grouped by Job ID – Sample ID, is as follows:

  • Test ID
  • Test Method Name
  • Investigation ID
  • Batch Number
  • Start Date
  • Started By
  • Starting Reason
  • Close Date
  • Closed by
  • Closing Reason

To start an investigation, right click on a test ID row and select ‘Start Investigation’. There cannot be more than one investigation open on a sample test. A window will pop up for the user to enter and save the start reason. When the investigation has started, a blue triangle will appear on the row. To view the investigation, click on the blue triangle. To stop the investigation, right click on the row and select ‘Stop Investigation’.

Figure 31: Investigations

Job Documents

Electronic documents or files (e.g. Word Document or picture) may be uploaded to LabHQ against a specific job. The time taken to upload documents is dependent on their size and your internet connection speed. It is recommended that files do not exceed 2MB.

To upload a document or file, select the Job Documents option from the Lims menu. The left pane displays a list of available jobs in LabHQ, along with their Batch Number, Product, Job Name and Client (See Figure 32 below). Select the desired Job ID from the list to view any existing documents or files uploaded against the job. To upload a new item, select the [Add] button from the bottom right corner. The file browser will then be opened for the user to locate the appropriate file. To begin uploading the file, select [Open], or to preview a file before uploading, right-click on the file name and select ‘Open’ from the menu. NB: The Documents list in LabHQ will remain empty until the upload has been completed.

Once the file has been successfully uploaded, a prompt will be displayed to send an email notification to the respective client. Select [Yes] to send an email to the client, or [No] to exit.
Following upload, the Job Documents screen will then be displayed again with the new file listed. Additional files may be added by selecting the [Add] button, or existing files removed by highlighting the appropriate file from the list and selecting the [Delete] button. If required, the file may be downloaded to an available directory (e.g. C: drive) by highlighting the document and selecting the [Download] button. The file browser will be opened for the user to select a location and save the file.

Figure 32: Job Documents

Retire Jobs

Jobs can be removed or retired from the Retire Jobs page. To access this page select Lims > Jobs > Retire Jobs. A table displaying the following job details will appear (See Figure 33 below):

  • Job ID
  • Description
  • Study ID
  • Batch Number
  • Product Name
  • Product Description
  • Client
  • Created Date

To retire a job, select a row and select [Retire]. A reason for change will be requested and the removal of this job will be recorded in the change log.

Figure 33: Retire Jobs
