[Q&A] Maximize the Value You See from Research Systems with these 3 Success Factors

April Schultz
October 2nd, 2018

During our recent webinar, Maximize the Value You See from Research Systems with these 3 Success Factors, Wendy Tate, Director of Analytics at Forte, discussed the top three practices identified by our survey respondents as critical to the success of systems at their organization: a designated point of contact for the system, internal communication, and system training. Here, Wendy answers attendee questions about what to expect from your system point of contact, when and how to mandate training, and more.

If I designate a point of contact (POC) for my organization, what should I expect of them? Can that one POC be used for multiple systems?

Whether a point of contact (POC) can cover multiple systems depends on several factors, including their other job requirements, system complexity, system usage, and the number of staff that person is supporting. It may be that there are two POCs: one who handles the IT portion of the system and one who understands the implementation and use of the system. I will focus on the latter. In this case, it is very important that this person understands how the system is utilized within the workflow of the organization, and I would expect them to understand how data flows into and out of the system.

The role of a point of contact can vary greatly depending on the needs of your organization. Regardless, your POC should communicate well with the vendor, organizational leadership/administration, and the organization's users. It is important that they communicate how the system should be used. They should be a proponent for system usage and convey a positive message about how the system helps productivity at both the user and organizational levels. They should also listen to users' issues and help route those concerns to leadership and/or the vendor, as appropriate.

The POC should also be heavily involved with training and re-training on the system. As the POC, they should know the most recent changes to the system and flag whether organizational changes need to be made in response. They should have a meaningful voice with the organization's leadership on how and whether system changes are made. With end users, if they are not actually providing the training, they should at least assist with developing materials, train the trainers, and be available, physically or virtually, to all new trainees for questions about system usage. I would expect this person to be very visible in communications about the system, at both the end-user and administration levels.

This person may also be involved with reporting from the system. In this case, an understanding of the use cases for the metrics and where that data is found is vital. They should understand how to process the information effectively and accurately. It is not required for the POC to actually run these reports and generate the metrics, but if they do not, they should be able to tell the person analyzing the data where to find the proper data. In addition, it is important to know how the system interacts with other systems. Systems do not act in a vacuum (usually), and the POC shouldn't act like they do. If they are the POC for multiple systems, they should understand how the systems work together (or do not) and how processes interact with the systems. If they are not the POC for all of the systems being discussed, examined, or utilized, they should know who the POC for each other system is, communicate with that person about major changes, and know the basic interactions between their system and the other system(s).

Being a POC takes effort, so if one person is designated as the POC for multiple systems, adequate time should be set aside to allow that person to have the proper interactions with vendors, users, systems administration and leadership to ensure the systems are being utilized in the best way possible.

How frequently should we mandate training and re-training on systems?

Several factors should go into the consideration of training frequency. If your staff can pass a training quiz without reviewing the material, it should be considered whether re-training is worthwhile (I would argue that it isn’t). Mandated training and re-training should serve a purpose. It can be used to ensure that the most up-to-date features are being utilized, it can ensure consistency of system use across the organization, and it can demonstrate compliance. Initial training and competency testing should be performed on critical systems. This will document not only system use, but organizational procedures and practices that are required to maintain high-quality research.

It is best practice to maintain methods for staff to voluntarily re-train, either through regular training offerings (perhaps allowing staff to attend new hire trainings as a re-training/refresher option) or through on-demand resources (online manuals and trainings). If a major transformation of product use occurs (for example, a new feature that will impact organizational workflows, or a redesign of the system), that definitely warrants mandatory re-training. Likewise, if there is a compelling regulatory reason, for example, to show compliance with Good Clinical Practice, regular mandatory re-training is warranted. I would support the notion that if compliance with regulatory requirements is the sole reason for re-training, it should occur no more frequently than annually and contain the minimum necessary to show compliance.

Do you have recommendations about how to incorporate all of the electronic training and retraining that is required in order to conduct a study (eCRF, lab training, etc.) into a budget?

There are several ways this could be done. One method is to apply a straight-percentage overhead rate that covers administrative work on a protocol (training, administration, utilities, "keeping the lights on", etc.). This would then be applied to all sponsored protocols to help cover the non-procedure costs associated with the research. Be prepared to justify this to the sponsor, as they will likely question why an across-the-board percentage is being applied. Another method is to do a quick time/effort study on training: calculate how long it takes one person to complete an average training for a protocol, then add this "hourly rate" of training to the protocol budget for the number of personnel expected to work on the protocol. This line item may be more palatable to sponsors, but it is also subject to mis-estimation of the actual time and effort spent. Remember that there isn't only protocol-specific training to consider; there are also job-relevant trainings on systems that are used for a study but are not study-specific (e.g., CTMS), as well as other research trainings (e.g., CPR, HIPAA, IRB, ethics trainings). Some organizations may decide these overall "infrastructure" trainings should/can be covered by institutional funds, so only the research-specific trainings would be budgeted back to the sponsor.
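As a rough sketch, the two budgeting methods above come down to simple arithmetic. The figures below (procedure costs, overhead rate, training hours, hourly rate, staff count) are purely illustrative assumptions, not recommendations; your organization's actual rates will differ.

```python
def overhead_method(procedure_costs: float, overhead_rate: float) -> float:
    """Method 1: apply a flat overhead percentage to the protocol's
    procedure costs to cover training and other administrative work."""
    return procedure_costs * overhead_rate


def time_effort_method(hours_per_person: float, hourly_rate: float,
                       staff_count: int) -> float:
    """Method 2: bill training as a line item based on a time/effort
    estimate per person, multiplied across the expected study staff."""
    return hours_per_person * hourly_rate * staff_count


# Illustrative assumptions only:
procedure_costs = 50_000.00  # assumed total procedure costs for the protocol
overhead = overhead_method(procedure_costs, overhead_rate=0.05)  # assumed 5%
line_item = time_effort_method(hours_per_person=4.0,  # assumed 4 hrs training
                               hourly_rate=60.0,      # assumed $60/hr
                               staff_count=6)         # assumed 6 staff

print(f"Method 1 (flat 5% overhead):   ${overhead:,.2f}")
print(f"Method 2 (training line item): ${line_item:,.2f}")
```

Under these assumptions, the flat-overhead method yields a larger figure than the training line item, which illustrates why a sponsor may push back on an across-the-board percentage and prefer the itemized estimate, and why the line-item approach risks under-recovery if the time estimate is too low.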

Want more answers?

Watch the free, on-demand webinar, Maximize the Value You See from Research Systems with these 3 Success Factors, to learn more best practices to improve the perception of technology among your research staff and increase the value you see from your research systems. Watch the recording today!