
Form Editor Improvements

When we started in 2016, Andrew and I had two main ideas for Medidata Rave: 1) Diagnostic Quality Checks and 2) Infix Edit Check editing. We didn't set out to create a better Architect, but we needed a way for users to view and correct issues in study design objects identified by Diagnostics.

We introduced our Form Editor in early 2017 with some nice innovations for cloning Fields and moving them around, copying View and Entry restrictions and various other improvements over Architect:

Original Form Editor

Recently Andrew worked on further improvements, and a few weeks ago these were made available on our Beta site.

Form Preview

We tried to make our Editor look a lot like the Form as it is displayed in Rave, but our users were asking for an improved preview and for a Rave EDC (RaveX) preview. This is now available for Rave Classic:

Preview Classic

And for Rave EDC (this one shown in full-width mode):

Preview RaveEDC

We're particularly excited about the RaveEDC preview since, at least at the time of writing, this is a capability that Rave Architect doesn't provide.

View and Entry Restrictions

Keeping track of View and Entry restrictions for Fields can be fiddly. In Architect and in our original Form Editor these were separate lists, but Andrew has now put them side by side, which makes it a lot easier to see whether a user is View or Entry restricted for a Field. Really you only want one of these checked: if a user is View restricted (they can't see the Field), then they are implicitly Entry restricted!

Editing Restrictions

In this screenshot you can also see an indication of the Form-level restrictions. Where a Field is restricted at the Form level there's no point in also restricting it at the Field level.

These changes to the editor also preserved the copy and paste functionality, so if you have a block of Fields which should have the same restriction settings you can copy the settings from one Field to another. This can save you a lot of time and reduce errors.

Previewing Restrictions

The new Form Editor also offers the ability to preview as a particular role, which makes it easy to check restriction settings. Here I'm previewing a Form as the Batch Upload user and I can see that the first Field is View Restricted to me (it appears greyed out).

Previewing Restrictions

Summary

These are just the highlights of our new Form Editor. Andrew has also done a lot of work to improve the look and feel and the use of visual space, to make designing and editing Forms in TrialGrid a really friction-free experience. We hope you'll like it, and as always we want to hear new ideas for improving the TrialGrid system!

Draft and Project Metadata

Medidata recently celebrated its 20-year anniversary, an amazing milestone. Congratulations to all our friends and colleagues at Medidata!

Rave Architect, the Form and Edit Check design part of Rave, isn't quite 20 years old, and in fact it still looks remarkably fresh. But nearly 20 years after it was first created, the needs of its users have grown. Some Rave installs have been running for more than a decade, accumulating hundreds of studies and reaching a scale that, I'm sure, would surprise the original designers of Architect.

That scale brings with it organizational challenges. Teams of 2 or 3 study builders in a single location have become departments of fifty or more spread across different continents and timezones. Which projects are active? Which projects belong to which therapeutic areas? Which are the Phase III or Phase IV projects? Rave Architect doesn't have the ability to capture that Metadata. The Project listing in Architect gives you just two useful attributes: the name of the Project and whether or not it is Active.

Architect Projects

This isn't a criticism of Architect; it's a tool designed to help you with the work of building and publishing EDC studies. It isn't a project planning tool, a team coordination platform or a Metadata repository. As an organization that uses Architect, you need to provide those things.

As a result, organizations have tools for Project planning, technical document management, specifications management, UAT findings tracking, programming review and many more. Some use commercial software, but many are home-grown using spreadsheets and shared drives.

Our goal at TrialGrid is to provide a single, integrated environment not only for the nuts-and-bolts work of the study build but also for the tracking activities that go on around it. As every Clinical Programmer knows, the job isn't done when you create a Form or an Edit Check. You also need to update a system or a spreadsheet to indicate that you've done that work and that it's ready to be reviewed by a technical reviewer, a standards manager, a tester or a manager. Life would be so much easier if there were a function in the study build tool that would perform that step.

That's why in TrialGrid we provide Labels that can be applied to any study design object to signal a workflow state (e.g. Ready for Review) or to provide informational tags (e.g. IxRS Integration) and Custom Metadata that can be applied to Forms and Fields in the study design to capture additional information (e.g. Form Standards Version, SDTM Annotation, E2B field relation, SDV tier).

And this week we added the ability to apply Labels and custom Metadata to Projects and to Drafts, allowing you to get a better overview of your Projects and to filter or search the list:

TrialGrid Projects

Just like Labels and Metadata on other object types, you get to choose the names and colors of Labels and the additional Metadata you want to capture. If the Metadata relates to study design objects such as Forms, Fields or the Draft itself, then it is automatically exported to and imported from ALS files. This is useful if your source of ALS files is not Rave Architect but some kind of Metadata Repository - now some of that useful Metadata can be transferred along with the study design, where it's helpful for Clinical Programmers.

We have further plans for this kind of metadata. Stay tuned!

If you would like to get future updates about this and other new features in TrialGrid, please use the subscribe button above to subscribe to our newsletter.

Tracing Bug Fixes to the Code

One of the key things that auditors want to see in validation documents is traceability between features, tests of those features and the evidence that the tests were executed. Normally this is presented as a requirements specification, a set of test scripts which reference the specification, and records showing that the test scripts were executed. For product management and for an auditor this provides assurance that you planned what you wanted to create, developed acceptance criteria and tested the results.

At TrialGrid we record our specifications as Issues in a GitLab repository and we reference the unique identifiers for these issues in our automated test scripts and in commit messages. I outlined this in our recent blog post. For this discussion the important thing to know is that for every Issue we have a summary page in our validation portal which pulls data from GitLab to display a description, history and links to requirements tests for that Issue.
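As a rough sketch of how a summary page can pull that data (the host, project ID and token below are placeholders for illustration, not our actual setup), GitLab's v4 REST API exposes each issue as JSON:

import requests

# Placeholder values for illustration - not a real GitLab instance.
GITLAB_API = "https://gitlab.example.com/api/v4"
HEADERS = {"PRIVATE-TOKEN": "your-api-token"}
PROJECT_ID = 42  # hypothetical numeric project id

def issue_summary(issue_iid):
    """Fetch title, state and description for one issue via the v4 API."""
    url = f"{GITLAB_API}/projects/{PROJECT_ID}/issues/{issue_iid}"
    response = requests.get(url, headers=HEADERS, timeout=10)
    response.raise_for_status()
    issue = response.json()
    return issue["title"], issue["state"], issue["description"]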

What about traceability between bugs and bug-fixes?

We also use Issues to track bugs. Some bugs have a visual component, and we include tests showing that these issues are fixed as part of our automated requirement test scripts. These "Bug Issues" then have traceability from the Issue to the tests that show they are fixed, just like a "Requirement Issue".

However, not all bug fixes have a visual component so we would end up with some Issue pages with no hyperlink traceability to any proof that the bug had been fixed and tested.

Is this a big deal? In my experience, organizations don't provide this kind of traceability. Bug fixes are treated separately from requirements, and references to fixed bugs go in the release notes. There may be a unit test that proves the bug was fixed, but maintaining a traceability matrix between unit tests and fixed bugs is more work than anyone is looking for.

Still, it bothered us that there was this lack of traceability in our validation docs. In our unit testing code we would make reference to an issue. For example:

# issue 1359
def test_result_suffix_inactive_field(self):
    """Diagnostic should generate one result for suffix when field is inactive"""
    self.params["ignore_inactive"] = "false"
    ...

With a comment like issue 1359 we were already making a reference to the issue - what we needed was a way to find these kinds of references and match them up with the pages in the validation docs via a hyperlink to the code.

Code is data

Our source code coverage analysis system already generates hyperlinked code listings which we include in our validation package. These listings allow us to create hyperlinks which will take the reader to an exact line of a particular file. What we needed was a way to read all our source code, identify the issue comments and make an association between an issue and the file and line where the issue is referenced.
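As an illustration of the approach, here is a minimal sketch using Python's built-in tokenize module. The comment convention is the one shown above; the directory layout and link format are assumptions, not our actual implementation:

import re
import tokenize
from pathlib import Path

# Matches comments like "# issue 1359" (the convention shown above).
ISSUE_REF = re.compile(r"issue\s+(\d+)", re.IGNORECASE)

def find_issue_references(root):
    """Yield (issue_number, path, line) for every issue comment under root."""
    for path in Path(root).rglob("*.py"):
        with open(path, "rb") as source:
            for token in tokenize.tokenize(source.readline):
                if token.type == tokenize.COMMENT:
                    match = ISSUE_REF.search(token.string)
                    if match:
                        yield int(match.group(1)), path, token.start[0]

# Group references by issue so each validation page can hyperlink into
# the coverage listings, e.g. "tests/test_diagnostics.py#L94".
links = {}
for issue, path, line in find_issue_references("tests"):
    links.setdefault(issue, []).append(f"{path}#L{line}")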

Happily for us, Python has good support for analysing source code, so we were able to get this working in just a few hours. The end result is hyperlinks from an issue page:

Issue Summary

to the source code that exercises the issue:

Source Code Link

Summary

We don't expect that auditors will want to review the source code of unit tests, but we think that providing traceability to bug fixes is useful and meets the needs of auditors who want to see evidence.

We're not yet done with improving traceability in our validation package. More on our ongoing efforts in a future blog post.