Behavioral Health Workforce Education and Training (BHWET) Grant Webinar

Welcome to the Health Workforce Technical Assistance Center's webinar series. This webinar is entitled "Behavioral Health Workforce Education and Training Grant Webinar," and it was presented by Beth Phoenix and Jennifer Mautone on July 10th, 2019.

My name is David Armstrong, and today's webinar focuses on best practices for program evaluation. Joining me are Jessica Buche and Maria Gaiser from the Behavioral Health Workforce Research Center. After today's presentations, you may ask questions using the chat panel on your screen. Also, when the event ends, you will be directed to a short evaluation survey, so please take a few moments to provide us with your feedback. Now, that said, Jessica, would you like to go ahead and introduce today's speakers?

Maria will, actually. Thank you very much, and sorry about that. We are pleased to introduce today's speakers, Dr. Beth Phoenix and Dr. Jennifer Mautone. Dr. Phoenix is vice chair of the Department of Community Health Systems and a health sciences clinical professor at the UCSF School of Nursing. Dr. Mautone is an assistant professor of school psychology in psychiatry and also associate director of psychosocial research in primary care in the Department of Child and Adolescent Psychiatry and Behavioral Sciences at the Children's Hospital of Philadelphia.

We have just a few housekeeping items to take care of before we get started. First, this webinar will run for approximately 30 minutes. We will be recording the session for those who are unable to attend today, and the recording will be made available on the Health Workforce Technical Assistance Center website as well as on the BHWRC website and the BHWRC social media accounts next week. Second, we'd love to hear from you during today's presentations, so if you have a question for either of our speakers, feel free to send it through the chat functionality, and we'll attend to questions at the end of today's session. If we're unable to get to your question during today's webinar, we will follow up with you via email.

We'll open today's session with Dr. Mautone, who's going to share some of the strategies she uses to evaluate her BHWET training program's successes and challenges and student training performance. Both of our speakers will help us answer some commonly asked questions about best practice strategies and evaluation tools, drawing on their BHWET program experiences. So without further ado, we'd like to welcome our two presenters today, and we'll open with Dr. Mautone.

Great, thank you very much. I'm really excited to be here and to share some of our information about how we've gone about evaluating the training program in our integrated primary care service. I do want to mention my two colleagues, Dr. Billie Schwartz and Dr. Tom Power, who have both been critical leaders in the development and implementation of this project; neither is able to be here with us today, so I'm going to be speaking for all of us.

We're going to talk a little bit about the context in which our training program exists within the whole hospital health system. Then I want to talk specifically about our project training goals, including for program and curriculum development broadly, and about how we implement and evaluate experiential training to determine how our trainees are meeting their competencies. Finally, we'll talk about our program evaluation plan and share some of our recommendations for best practices as you move forward with evaluation of your own similar training programs.

This slide illustrates the multiple levels of behavioral health service delivery as we conceptualize them. Our trainees rotate through the base of that pyramid in our Healthy Minds, Healthy Kids program, which is the integrated behavioral health and primary care service at Children's Hospital of Philadelphia, or what we call CHOP. As is the case with integrated care services nationally, our goal with Healthy Minds, Healthy Kids is to support patients and families early, hopefully to alter the trajectory of problem development, and to improve access to behavioral health service delivery. We serve as consultants to patients, families, and primary care providers. This is often a new way for many of our trainees to be working as behavioral health providers, so we do not actually expect that trainees come into our training program having had experience in primary care, although many of them do.

We have trainees in psychology, child and adolescent psychiatry, and clinical social work who rotate with Healthy Minds, Healthy Kids attending providers in all three disciplines. They work predominantly in primary care practices that serve a high proportion of patients who are Medicaid-insured and living in the city of Philadelphia. Although our Healthy Minds, Healthy Kids service spans the Greater Philadelphia region more broadly, our trainings are focused on service delivery to urban underserved communities. All of the patients we see are patients of the primary care practice where we're embedded, and you can see a little bit of an overview of how we provide our service. We strive for a fully integrated model where behavioral health providers and primary care providers work collaboratively to support our patients.

To get more specifically at our BHWET training grant's program goals and our evaluation plan, here you can see our project goals that were specifically focused on program development. Our psychology internship program here at CHOP is accredited by the American Psychological Association, and it has several subspecialty tracks, one of which focuses on integrated behavioral health in primary care. Before this training grant was funded, we had one department-funded intern on that track. We have had many years of funding from the Graduate Psychology Education (GPE) program through HRSA as well, and with those grants we were able to have other trainees, but as that funding ended, we were again down to one department-funded intern. So we're really excited that this BHWET grant has allowed us to expand our capacity to four trainees in the last year. Over the course of this project we're prepared to expand to as many as five or six, and then hopefully continue to work on our sustainability plan to increase the number of department-funded roles so that our program can sustain that larger size.
In addition, this grant has really increased our focus on interprofessional practice in integrated primary care, so that we are strengthening the integration of our training program across psychology, psychiatry, and clinical social work with a specific focus on integrated primary care competencies; I'll discuss that in a little more detail on the next slide. Finally, I want to call your attention to our goal related to trainee diversity. We here at CHOP have had a really successful track record of recruiting trainees of diverse backgrounds, and we are continuing to refine our recruitment efforts so that we can increase the diversity of our training classes, with the hope of increasing workforce diversity in behavioral health more broadly.

Here you see our conceptualization of the domains of interprofessional competence that we consider to be especially important for the trainees in all three disciplines rotating in our integrated primary care service. We recognize that each behavioral health discipline has unique competencies that trainees must master during their training; that's the part of the Venn diagram that does not overlap, and our within-discipline supervisors address those training areas with each trainee over the course of their time with us. In addition, we consider these overlapping domains, the six in the rectangle there, that are not discipline-specific but are more specific to integrated primary care, and we address those in our experiential and didactic training on this track. My colleagues and I published a paper related to this work in Child and Adolescent Psychiatric Clinics of North America, and the citation is there for those of you who might be interested in more details about how we outlined these six training competencies.

With all of that considered, and with our goals detailed as we just discussed, we started to think a little more about our evaluation plan and how we would proceed over the course of the project. I'd like to spend the rest of my time focused on the program evaluation plan and best practice suggestions. At the time of our project development, our team developed specific aims and identified process and outcome measures for each aim. We felt it was particularly important to include a multi-method evaluation plan, including both objective and subjective measures, to ensure the collection of really rich data. Additionally, we're utilizing an iterative process of development that includes focus groups, surveys, and individual interviews with trainees and faculty, for our aims related to curriculum development in particular, to ensure that we're really reflecting the feedback of all of our stakeholder groups in the department and in our training program. Next I'm going to get into some more detail about how we implement each of these components of our evaluation plan.

Here you see our aims and process and outcome measures related to interprofessional training. Trainees in psychology are expected to get 300 hours of training in integrated primary care, which basically amounts to one day a week for the entire training year, and psychiatry and social work trainees should get 150 hours; they spend only one semester with us in this rotation. We ask all trainees to complete training logs, and then our outcome measure related to that is the number and percentage of trainees who meet that hours goal annually. This is similar for our didactic training and the 15 hours that we expect there.
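To make that outcome measure concrete, here is a minimal sketch, in Python, of the kind of tally it implies. The hours targets come from the talk, but the trainee records and the share_meeting_goal helper are hypothetical, not part of CHOP's actual logging system.

```python
# A minimal sketch, not CHOP's actual system: tally training logs and
# report the share of trainees who met their discipline's hours goal.
HOURS_GOALS = {"psychology": 300, "psychiatry": 150, "social_work": 150}

# Hypothetical year-end totals pulled from trainee logs.
logged_hours = {
    "trainee_a": ("psychology", 312),
    "trainee_b": ("psychiatry", 158),
    "trainee_c": ("social_work", 141),
}

def share_meeting_goal(logs):
    """Percentage of trainees whose logged hours meet their discipline's goal."""
    met = sum(hours >= HOURS_GOALS[disc] for disc, hours in logs.values())
    return 100 * met / len(logs)

print(f"{share_meeting_goal(logged_hours):.0f}% of trainees met their hours goal")
```

Run monthly against cumulative logs, the same calculation doubles as the pacing check Dr. Mautone describes next, flagging trainees who are falling behind before the year ends.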
In terms of how we evaluate trainee competence, we have a multi-method evaluation process. Specifically, four times a year for psychology trainees, and two times for psychiatry and social work trainees, supervisors utilize a Likert-type rating scale that we've developed here at CHOP over several years to evaluate trainees' performance relative to the end-of-internship expectation. Basically, by the time our trainees leave here, we expect that they're ready for close-to-independent practice or for postdoctoral-fellowship-level training. In this manner, our ratings can illustrate growth over the course of the year; in other words, we don't expect trainees to achieve the highest ratings at the first evaluation period. We expect that their ratings are closer to ones or twos on all of our competencies, with movement toward a four or five by the time they graduate.

In addition to that evaluation form, supervisors meet in the middle of the fall and the middle of the spring semesters to discuss trainee progress in each rotation in person, so we get the faculty supervising the integrated primary care rotation, as well as the other rotations the trainees complete, together to try to identify common strengths and growth areas for each trainee. Then each faculty member has an individual in-person review meeting with the trainee about the specific rotation, and finally, every trainee gets a faculty advisor who meets with them to support integration of their feedback and refinement of their overall training goals as necessary.

Finally, to monitor trainee caseload over the course of the year, we have a dashboard that our faculty can use. Generally, as the program director, I'm the one who monitors that dashboard, so I can keep track of trainee caseload and the number of patients they're seeing, and then I use that evaluation feedback in a continuous process, along with the training logs, to ensure that the trainees are meeting their experiential training goals. Here you see an illustration of that process: on a monthly basis, I monitor their clinical logs and that patient care dashboard and provide feedback to the individual supervisors, and then they can take a look and make sure trainees are on track to meet their end-of-year expectations. In most cases our trainees have absolutely no problem meeting these goals; our clinics are super busy, so they see plenty of patients. But sometimes patient show rates drop or there are other scheduling challenges, and we have to make adjustments to ensure the trainees are meeting their goals.

As we think about our trainee recruitment, here's another example of how we use our CQI process. Our goal at the start of the project was that at least 33% of trainees would represent racial or ethnic minority groups, and although we've achieved that goal, we would be really excited to surpass it over the course of the project. This goal is specific to trainees on our integrated primary care track, although our training program leadership more broadly, and the internship program, is really motivated to increase trainee diversity beyond our track. So we're in the process right now of engaging an external consultant to refine our plan for trainee recruitment as we head into the next recruitment season this fall.
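As a hedged illustration of that recruitment CQI check, the sketch below compares an incoming class against the 33% goal from the talk; the roster and its flags are invented for the example.

```python
# Illustrative only: check an incoming class against the project's goal that
# at least 33% of trainees represent racial or ethnic minority groups.
DIVERSITY_GOAL = 0.33

incoming_class = [  # hypothetical roster
    {"name": "trainee_a", "underrepresented": True},
    {"name": "trainee_b", "underrepresented": False},
    {"name": "trainee_c", "underrepresented": True},
    {"name": "trainee_d", "underrepresented": False},
]

share = sum(t["underrepresented"] for t in incoming_class) / len(incoming_class)
verdict = "meets" if share >= DIVERSITY_GOAL else "falls short of"
print(f"Representation of {share:.0%} {verdict} the {DIVERSITY_GOAL:.0%} goal")
```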
And finally, I want to discuss an example of iterative program development that's specifically related to curriculum revision. We have multiple didactic seminars, and one of them is known as the Interprofessional Seminar in Community Practice. We invite trainees in all three disciplines to the meetings, which occur six times per training year. The seminar usually follows a journal club format, wherein one of the trainees selects a journal article related to service delivery with underserved populations, presents that paper briefly, and then leads the discussion among all of the trainees. Over the course of the 2018-2019 training year, which we just finished this past June, we saw trainee attendance and engagement in that seminar really decline, and our faculty were concerned. This is a really important seminar for us, so we figured we had to look at it carefully and think about how we were going to refine it for the coming training year.

Our plans for improvement included trainee feedback: we did interviews and a survey, several of our program faculty conducted observations of the last couple of seminars in the training year, and then our training leadership faculty discussed the seminar format, and that discussion included leadership from all three disciplines. For this current training year, which just began last week, we now have a plan to refine the seminar, and that plan is going to begin tomorrow with our orientation meeting with our new trainees. We plan to pair the trainees across disciplines and ask them to work together to lead their assigned seminar sessions. The trainees this year, we hope, will actually have a lot more flexibility to decide on the format, so it's not necessarily just going to be journal club anymore, and that again was in direct response to trainee feedback. In this way, we're hoping to give trainees more independence to be creative in their teaching and to encourage interprofessional work among the trainees. During this coming year, we're going to continue the evaluation process, including faculty observation and trainee feedback, to continue to refine that seminar as necessary.
So basically, I just want to wrap up to make sure I give Dr. Phoenix her time. What's important from our perspective, what we've learned, is that we really feel it is important to include a multi-method, multi-informant evaluation plan that includes all key stakeholders. We have faculty leaders, all training faculty across disciplines, and our trainees involved in providing us with data as we go through the process. We also feel that it's incredibly important to keep the trainees involved in the program evaluation process as much as possible, so the CQI process I discussed and some of the quality improvement projects we've done have been either led by trainees or have included trainees as team members. In many cases we get trainees who have very strong research interests, and they're highly motivated to help with this process and think about ways we can work on refinement; our trainees are also involved in doing focus groups and interviews. One of the other things we've found is that we benefit from having a very well-developed training program broadly, outside the context of this grant, and relying on that existing infrastructure and those resources has been tremendously helpful to us as we've developed and tried to implement our program evaluation plan. So that's all I have to share right now; we'll turn it over to Dr. Phoenix.

Well, thank you so much, Dr. Mautone, for speaking with us today about your program evaluation experiences. We're going to hand the floor over to Dr. Phoenix now. As we previously mentioned, Dr. Phoenix is with the UCSF School of Nursing. Dr. Phoenix, over to you.

So, I'd like to tell you a little bit about our project to enhance the California psychiatric mental health nurse practitioner workforce. The overall purposes of this grant are to expand clinical residency placements in rural and primary care settings, because most of our clinical residency sites have been in specialty mental health settings up to this point, and to strengthen preparation of our graduates to work in interprofessional teams in integrated primary care and behavioral health. In addition to the grant purposes that were specific to the RFP, a major focus of this grant is to reduce the significant financial stress experienced by our students living in the very expensive Bay Area.

Our specific grant objectives are to administer 18 stipends per year to students who are working with the key populations specified in the RFP; to do this, we are increasing our admissions to 34 per year (we were running about 28 to 30 before) and also recruiting a larger proportion of our cohort from outside the Bay Area, and we are increasing our number of residency placements in either integrated care or rural sites over the course of the grant. A third objective has been to collaborate with the pediatric primary care program in our university on some joint coursework, as well as a psychiatric consultation warm line in some school-based primary care settings where the pediatric primary care nurse practitioner program has residency sites.

There were four areas that we wanted to examine when evaluating our project's success, and we started by looking at what data are already being collected by our institution and how we could access them. The four areas we're looking at are students, placement sites, graduate outcomes, and our project-specific objectives.
Looking at the student component of it all, these data are readily available through existing school and campus data systems: the application system, the yearly student census the school publishes, and E*Value, which is used for tracking and evaluating students' clinical experiences. Not only the evaluations of student performance but also the students' evaluations of the clinical placement sites are collected in E*Value.

For the clinical placement sites, all of this placement data is in E*Value as well, and unfortunately it's somewhat cumbersome to retrieve; the system has a lot of functionality, which sometimes makes it a little difficult to find the actual functions you want to use. So for a kind of quick-and-dirty evaluation, we often refer to the student placement worksheet that's maintained by program faculty as they finalize placements.

For graduate outcomes, we rely primarily on our own student survey, a Qualtrics survey that we send out to our graduates shortly after graduation because that fits the HRSA timeframe. We were hoping we would be able to get more data from the School of Nursing graduate survey, but those surveys have a low response rate, and they also don't ask for some of the data that's required by HRSA, so we've found that doing our own survey has been more productive in terms of getting this information.

In terms of the project-specific activities, here are some of the specific activities and some of the indicators that we use. One of our goals was to develop a placement database with better functionality than what we currently use, which is basically Excel spreadsheets. Because of some changes that have been going on at the school level, we have not been able to make these improvements yet, since we're trying to coordinate with school activities, but what we really want to look at is whether, once we refine this system, it leads to less time and fewer headaches for the faculty and staff involved in the placement process. In terms of our joint coursework with the pediatric nurse practitioner program, we're primarily looking at course evaluations from students and what they get out of it. And then, for the clinical training and clinical collaboration via the warm line, we want to look at the number of trainings provided to primary care providers and trainees, their satisfaction, and how well the warm line is working. This is another objective where we've had some delays, because some of the people involved left the university, but we're hoping to get it up and going in the coming year.

So, key points: use existing institutional data systems as much as possible to inform quality improvement processes, but when the institutional systems are too slow or too cumbersome, or don't provide the data that you need, look to develop your own data collection tools, because sometimes it's just more of a hassle to use the big data systems than it's really worth.
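As a sketch of the kind of lightweight, home-grown tool Dr. Phoenix describes, the snippet below summarizes residency placements from a faculty-maintained worksheet exported to CSV. The file name and column name are assumptions for illustration, not UCSF's actual worksheet format.

```python
# A quick-and-dirty summary of residency placements from a faculty-maintained
# worksheet exported as CSV; file and column names are assumed for illustration.
import csv
from collections import Counter

def placements_by_setting(path):
    """Count residency placements by setting type (e.g., rural, integrated care)."""
    with open(path, newline="") as f:
        return Counter(row["setting_type"] for row in csv.DictReader(f))

for setting, n in placements_by_setting("placement_worksheet.csv").most_common():
    print(f"{setting}: {n} placements")
```

A simple script like this can track an objective such as growth in rural and integrated care placements without waiting on the larger institutional system.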
So that's pretty much what I have to say. I'm hoping for questions; I'll have to get them through the chat feature, because I can't hear you all, unfortunately.

Wonderful, thank you so much to both of you, Dr. Phoenix and Dr. Mautone, for taking the time to talk with us today about your program evaluation experiences. We've been interviewing different BHWET grantees here at the BHWRC for the last year to ask them about their evaluation experiences as well as student training, and your experiences, Dr. Mautone and Dr. Phoenix, will really help other grantees who are just getting started with program evaluation and also those who are trying to adopt some new strategies. We're going to take the remaining few minutes to open up the floor for any questions the attendees may have, so feel free to enter those questions through the chat feature.

We have a question here in our chat. Someone is wondering, and I will chat this along to Dr. Phoenix as well, whether either of you has conducted evaluations that measure benefit to the host organizations. I do know from some of the research we've done here at the BHWRC that there are BHWET grantees that are actually the host organizations themselves receiving BHWET funding. But I'll pose that question to you first, Dr. Mautone: do you do any evaluations that measure benefit to your host organizations?

That's a good question. Our training grant is awarded directly to CHOP, and our training program exists within our institution, so we kind of are the host organization; the trainees rotate in our own networks. So all of the evaluation we do related to improvement in our integrated primary care service informs the Healthy Minds, Healthy Kids program more broadly, not just the training program, in terms of how we're serving families. We have a component of family engagement in service delivery; I focused a lot on what we do to support and evaluate training and trainee involvement, but we also have an evaluation component that looks at family involvement in the service and family access to behavioral health care, and that is really informing our pediatric primary care practices broadly. So I think that maybe gets at what the question is asking.

Okay, so the question is, do you collect data to evaluate benefit to your host organizations? The data we collect that would be relevant to this is mostly gathered during site visits, along with the sites' evaluations of student performance. Other than that, we don't really have a formal way of doing it, but I think just the fact that organizations are willing to keep taking our students year after year indicates that they benefit from having our students in their organization, because if they weren't getting benefit, they wouldn't continue to take student trainees.

Thank you, Dr. Phoenix. Well, that is all the time we have today for questions. If there are any remaining questions or comments about today's webinar and anything covered, feel free to contact Jessica or me through the BHWRC website, and as I mentioned earlier, we'll make a recording of this webinar available next week. Thank you so much to Dr. Mautone and Dr. Phoenix for taking the time today to share all this useful information with us; we really appreciate it. So without further ado, thank you again, and enjoy the rest of your day.

Thank you for watching the webinar. To view our extensive webinar library and other helpful resources, please visit us at healthworkforceta.org.
