7 Replies

  • Raven5756 wrote:

    I have to come up with a software validation procedure that explains how we validate software before it goes into production

    It depends on what you mean by one software validation procedure. This is fine if it is sufficiently abstract, but your procedure needs to address variants due to different risk profiles. So it is a matter of consideration whether you have one or several procedures. You'll need to document the relation of those variants to the encompassing procedure. This can be presented as a tailoring and customization for the various software and risk profiles, e.g. in an approved report.

    Raven5756 wrote:

    and corresponding testing records. 

    Sounds like something is missing in between.

    It was several years ago that I had to address ISO 13485. I wasn't in charge of that duty; I was in charge of software configuration management, and ISO 13485 has implications for software configuration management too. I was working as a temp for a customer in the medical device manufacturing industry, in a project with invasive devices, hence the highest risk evaluation requirements. The customer's risk manager wasn't a help with my part of the task, as he was only able to evaluate medical risks, although the regulation requires a broader view. And there was no risk manager addressing other risks.

    Preceding project versions had used spreadsheets to list all deployed software tools with various additional information and links. This included the result of risk categorization, the supplier, and various other details. It did not include any test records; I wouldn't consider it manageable if all test records were included. The sheet provided an overview of what had already been evaluated, with which result, and where to find further details. But I guess that an appropriate database application would be more suitable.

    • Which tools are you using for your test management?
    • Which tools are you using for risk management?
    • Don't these tools include components suitable to do what you're looking for?

    As far as I remember, some tool vendors offer a kind of profile set to customize their tool set to the needs of medical device regulation and standards. That last customer used such tools but didn't purchase these profile sets, although the tool vendors offered them. Some of these tool sets can be considered database applications.

    Raven5756 wrote:

    The procedure I have done but I'm struggling to come up with a template to use that encompasses everything that we have to provide for records (essentially, proof that we tested  and documented EVERYTHING).

    That sounds incomplete to me.

    • What's your setup of approval gates, and how do you report these approvals by the authorized, required, and optional parties or stakeholders?

    For every software, you'll end up with a set of test results compiled into a test report, balanced against your test plan, with an appropriate recommendation either for further testing or for confirming that the planned and specified accepted risk levels are met. And your procedure describes how such reports will be used to accept the remaining risks, by which stakeholders, represented by which persons, in which qualification. This approval is part of the records you'll need to provide to auditors at the next audit.
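    The records described above (results compiled into a report, balanced against a plan, approved by named stakeholders) can be sketched as plain data structures. This is a hypothetical illustration; all class and field names are my own, not from any standard or tool.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: test results compiled into a report,
# balanced against the plan, with stakeholder approvals recorded.

@dataclass
class TestResult:
    test_id: str
    passed: bool
    evidence: str  # e.g. screenshot filename or reviewer initials

@dataclass
class Approval:
    stakeholder_role: str   # e.g. "Quality Manager"
    person: str
    qualification: str      # why this person is authorized to approve

@dataclass
class TestReport:
    software: str
    results: list = field(default_factory=list)
    approvals: list = field(default_factory=list)

    def recommendation(self) -> str:
        # Balance results against the plan: recommend acceptance only
        # if every planned test passed, otherwise more testing.
        if self.results and all(r.passed for r in self.results):
            return "accepted risk levels met"
        return "further testing required"
```

    The point of the sketch is that the approval records live alongside the results, so the whole object is what you would retain for the next audit.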

    And no, you don't need to document everything. But your preparations will record what does not need to be recorded, and for which reasons. I don't remember which standard addressed which part of the risk assessment and quality management aspects for medical devices. But I remember that when the risk assessment proved a low risk, that assessment itself needed to be documented, while many other documentation requirements did not apply to such low-risk software.

    Raven5756 wrote:

    We've always tested software but to various degrees depending on how big the software was and there was many times we didn't record the testing. 

    It isn't the size of the software that determines its testing needs. It's the risk assessment and the intended use that determine them. Organisational or technical constraints for enforcing the intended use and preventing other uses may be relevant to that risk assessment. So testing records are one aspect, risk assessment records another. The testing records depend on the risk assessments, and they need to be recorded too. Revision safety is important for your records as well, including approval records.

    The involved standards provide models for testing, classification, approval, and so on, and tool providers offer templates meeting these constraints. The application domain of ISO 13485 uses a simplified risk model. And as far as I remember, there is another ISO standard providing the risk models you'll need in order to map the diversity of software and its use onto the models for your application and devices. That other standard has been revised and split into two since my last use.

    In another project, I was involved with system integration and system testing, in another domain with a far more complex risk model, where the risks extend to many thousands of human lives per device/plant. There we used a test management tool, HP Quality Center, which provided much of the needed tool support. Recordings were kept in its underlying databases, and it offered many reporting options. This tool set has since been renamed Micro Focus Quality Center after a change of ownership and is now part of its Application Lifecycle Management solution.

    So have a look at what you already have. Either what you're looking for is already included, or it is available as an add-on from the same supplier, providing more customizations for your business domain than just such templates or reports.

  • I have to do this on the regular, so much so that I have Computer Systems Validation Engineers on staff. I know exactly what you're talking about and it does seem daunting. Our testing is based on how the system will be used and a risk assessment we perform.

    Your testing will not always be the same, but it will cover some key areas like Access Control, Power Failure handling, Backup/restore procedures/execution, Permission Levels (if applicable), network testing, loss of network testing, testing critical use functions of the software etc.

    We have 3 different Qualifications we do: Installation Qualification (IQ), Operational Qualification (OQ), and Performance Qualification (PQ). We write our Operational Qualifications (OQ) in a table format broken out by sections that aligns with our Trace Matrix so we can map what test links to what configuration section in our Configuration Specification (welcome to Pharma).

    We write the test section and any pre-reqs for the test, such as needing a user account with a specific access level, how to perform the test, and what the expected result should be. Then we leave a box to record whether the result was acceptable, and we capture the result either with a picture or screenshot, or have two people initial to confirm that the result was 'as observed'. If it's a picture or screenshot, we then footnote the attachment number for that test script (fyi - by script I just mean instructions on how to do the test, not an automated file/program we run). I've attempted to attach an example of one of our test scripts.
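    The row structure described above (pre-reqs, steps, expected result, a result box, an evidence reference) and its mapping back to configuration sections can be sketched roughly like this. This is a generic illustration, not Pharma tooling; all names are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch of one OQ test script row and a simple
# trace-matrix grouping by Configuration Specification section.

@dataclass
class OQTestScript:
    section: str                 # Configuration Specification section it maps to
    prerequisites: str           # e.g. "user account with operator access level"
    steps: str                   # how to perform the test
    expected_result: str
    actual_acceptable: Optional[bool] = None   # the "result box"
    evidence_ref: Optional[str] = None         # attachment number or initials

def trace_matrix(scripts):
    """Group test steps by configuration section, as in a trace matrix."""
    matrix = {}
    for s in scripts:
        matrix.setdefault(s.section, []).append(s.steps)
    return matrix
```

    Keeping the section identifier on every row is what makes the test-to-configuration mapping mechanical rather than manual.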

    Attachment: OQexample.jpg (144 KB)
  • scheff1 Thanks for the reply.  We are using an outside consulting firm that provided us with template procedures for documentation and verification of software.  As part of the procedure template they provided - they left it pretty vague and seemed more concerned that we keep documentation of what we decided to test, if anything.  I also did some research on how to assign risk assessments so I feel like I have a good idea about how to assign that as well.  What I was looking for was more along the lines of examples of testing checklists.  Something maybe to help encompass both the validation plan and any testing that was done.  

  • Manwiththeforce​  Thanks for the reply,  This helped a lot.  You listed areas of testing that I have never thought to test before so I will be adding them to my list!  Thanks!

  • Raven5756 wrote:

    What I was looking for was more along the lines of examples of testing checklists.  Something maybe to help encompass both the validation plan and any testing that was done.  

    The aforementioned Quality Center solution is a test management solution, not specific to a particular business domain. With it, I created test plans and testing checklists as well as reports. I used it with an integration to/from requirements management. There are integrations which may already generate proposals for test plans and testing checklists.

    For risk assessment, I had no tool support. I was using ISO 14971:2012. This standard has since been revised into ISO 14971:2019, and its informative part was split off into ISO/TR 24971:2019. It was this informative part that I proposed to use for selecting a risk model and as a constraint on validation, including to determine the granularity of validation.

  • scheff1​  Oh ok,  I must have misread the brochure - I didn't realize it was a software in itself.  I will take a look at it.  Thanks!

  • For software validation I split mine into:

    Off-the-shelf software, where I ask the supplier to perform IQ and provide templates for OQ, and we perform PQ on similar templates with our product(s)/tests.

    In-house written software (Excel/Access) for EQR, SOPs and Instrument Data Capture, where validation is based on the risk and nature of the software. I give our R&D/QC staff a template prompting for definitions of the test, and the desired and actual outcomes.
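    A template like the one described, one row per planned test prompting for a definition, the desired outcome, and the actual outcome, can be generated trivially. This is a minimal hypothetical sketch; the column names are illustrative, not from any regulatory template.

```python
# Minimal, hypothetical fill-in checklist: one blank row per planned
# test, prompting for definition, desired outcome, and actual outcome.

HEADERS = ["Test definition", "Desired outcome", "Actual outcome", "Pass?"]

def blank_checklist(test_names):
    """Return one empty row per planned test for staff to fill in."""
    return [
        {"Test definition": name, "Desired outcome": "",
         "Actual outcome": "", "Pass?": ""}
        for name in test_names
    ]

rows = blank_checklist(["Access control", "Backup/restore", "Data capture"])
for row in rows:
    print(" | ".join(row[h] for h in HEADERS))
```

    The same rows could just as easily be emitted to a spreadsheet, which matches the Excel/Access setting mentioned above.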

    I found the FDA Website helpful: 


